A four-wheeled mobile robot with an autonomous navigation system in ROS

May 7, 2016
Posted by Michał Drwięga

 

The presented mobile robot was equipped with an autonomous navigation system based on the ROS framework, using mainly data from a Microsoft Kinect sensor. The robot has four independently driven wheels and is rather small: the width and length are about 35 cm, and the height with the Kinect sensor is below 50 cm. The pictures show the real robot and visualizations of it.
[Photos: the real robot and two visualizations of it]

The first version of the robot, without the autonomous navigation system, was described on the following page: Four wheeled mobile robot with Raspberry Pi. It was equipped with a Raspberry Pi computer, a set of sensors, and four brushed DC motors from Pololu.

[Photo: the interior of the robot]

 

For the purposes of the autonomous system, the mobile platform was extended with an additional computer (temporarily a notebook with an Intel i3 and 4 GB of RAM) and a Microsoft Kinect sensor with mounting elements. The diagram shows the hardware architecture of the robot.

[Diagram: hardware architecture of the robot]

 

The architecture of the high-level control system of the mobile robot

 

The control system of the robot has a layered structure and was realized with a few computation units. One of the requirements for the low-level control system was real-time operation, so all time-sensitive algorithms run on the Kinetis microcontroller. Many of the solutions used there were described in the previous entry. The high-level part of the control system does not need to meet hard real-time requirements, so non-real-time interfaces such as Ethernet and USB were used. The diagram below shows the architecture of the control system.

[Diagram: architecture of the control system]

 

The higher-level layer of the control system was realized in the ROS Indigo framework. This development environment provides mechanisms for communication and synchronization between software nodes. Moreover, ROS makes it easy to run control algorithms on several computers, so part of the control system ran on the Raspberry Pi and the rest on the notebook. The components were realized as nodes which communicate with each other through a mechanism called topics (a minimal sketch of such communication follows the list). The control system consists of the following components:

  • PS3 Pad teleoperation
  • Keyboard teleoperation
  • Velocities filter
  • Velocities multiplexer
  • User application with rviz
  • Localization filter
  • Hardware driver
  • Kinect OpenNI
  • Navigation
  • Laser scan from Kinect
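
As a minimal, self-contained sketch of this topic mechanism (the node name and velocity value below are illustrative, not the robot's actual code), a node publishing velocity commands on /cmd_vel could look like this:

    #!/usr/bin/env python
    # Minimal example of topic-based communication between ROS nodes.
    import rospy
    from geometry_msgs.msg import Twist

    rospy.init_node('velocity_source_example')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    while not rospy.is_shutdown():
        msg = Twist()
        msg.linear.x = 0.1  # drive slowly forward [m/s]
        pub.publish(msg)
        rate.sleep()
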
PS3 Pad teleoperation

This component allows manual control of the mobile robot with a Sony DualShock 3 pad. A deadman button was implemented for safety purposes: the robot moves only while the button is held.
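
A minimal sketch of the idea, assuming a hypothetical deadman button index (the actual mapping of the DualShock 3 buttons depends on the joystick driver):

    #!/usr/bin/env python
    # Pad teleoperation with a deadman button: velocity commands are
    # forwarded only while the button is held.
    import rospy
    from sensor_msgs.msg import Joy
    from geometry_msgs.msg import Twist

    DEADMAN_BUTTON = 10          # hypothetical index of the deadman button
    MAX_LIN, MAX_ANG = 0.3, 1.0  # velocity limits [m/s], [rad/s]

    def joy_callback(joy):
        cmd = Twist()
        if joy.buttons[DEADMAN_BUTTON]:            # move only while held
            cmd.linear.x = MAX_LIN * joy.axes[1]   # left stick, up/down
            cmd.angular.z = MAX_ANG * joy.axes[0]  # left stick, left/right
        pub.publish(cmd)                           # zero command otherwise

    rospy.init_node('ps3_teleop_example')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/joy', Joy, joy_callback)
    rospy.spin()
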

 

Keyboard teleoperation

This component allows controlling the platform with defined keyboard keys. It is realized as a Python script in which the function of each key can be configured.
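
A minimal sketch of such a script; the key bindings below are illustrative:

    #!/usr/bin/env python
    # Keyboard teleoperation: read single keystrokes from the terminal
    # and map them to velocity commands.
    import sys, termios, tty
    import rospy
    from geometry_msgs.msg import Twist

    BINDINGS = {'w': (0.2, 0.0), 's': (-0.2, 0.0),   # forward / backward [m/s]
                'a': (0.0, 1.0), 'd': (0.0, -1.0)}   # turn left / right [rad/s]

    def get_key():
        # Switch the terminal to raw mode to read one key without Enter.
        settings = termios.tcgetattr(sys.stdin)
        try:
            tty.setraw(sys.stdin.fileno())
            return sys.stdin.read(1)
        finally:
            termios.tcsetattr(sys.stdin, termios.TCSADRAIN, settings)

    rospy.init_node('keyboard_teleop_example')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    while not rospy.is_shutdown():
        key = get_key()
        if key == '\x03':  # Ctrl-C stops the node
            break
        lin, ang = BINDINGS.get(key, (0.0, 0.0))
        cmd = Twist()
        cmd.linear.x, cmd.angular.z = lin, ang
        pub.publish(cmd)
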

 

Velocities multiplexer

The velocities multiplexer allows switching between sources of velocity commands, for example autonomous navigation and manual control.
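
A minimal sketch of such a multiplexer, with assumed topic names (/teleop/cmd_vel, /nav/cmd_vel) and lock-out time; the teleoperation source simply preempts navigation for a short period:

    #!/usr/bin/env python
    # Velocity multiplexer: teleoperation commands (higher priority)
    # override autonomous navigation commands for a short lock-out period.
    import rospy
    from geometry_msgs.msg import Twist

    class CmdVelMux(object):
        def __init__(self):
            self.last_teleop = rospy.Time(0)
            self.lockout = rospy.Duration(0.5)  # teleop overrides nav for 0.5 s
            self.pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
            rospy.Subscriber('/teleop/cmd_vel', Twist, self.teleop_cb)
            rospy.Subscriber('/nav/cmd_vel', Twist, self.nav_cb)

        def teleop_cb(self, msg):
            self.last_teleop = rospy.Time.now()
            self.pub.publish(msg)  # teleop always goes through

        def nav_cb(self, msg):
            # Forward navigation commands only when teleop has been idle.
            if rospy.Time.now() - self.last_teleop > self.lockout:
                self.pub.publish(msg)

    rospy.init_node('cmd_vel_mux_example')
    CmdVelMux()
    rospy.spin()
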

Velocities filter

This component filters the velocities sent to the low-level controller. It allows bounding the velocities and accelerations of the mobile platform.
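
A minimal sketch of the idea with illustrative limits and topic names; the acceleration bound assumes commands arrive at a known rate:

    #!/usr/bin/env python
    # Velocity filter: clamp commanded velocities and limit how fast
    # they may change (acceleration bound).
    import rospy
    from geometry_msgs.msg import Twist

    V_MAX, W_MAX = 0.5, 1.5  # velocity limits [m/s], [rad/s]
    A_MAX = 0.5              # linear acceleration limit [m/s^2]
    RATE = 20.0              # assumed command rate [Hz]

    last_v = 0.0

    def clamp(x, lo, hi):
        return max(lo, min(hi, x))

    def cmd_cb(msg):
        global last_v
        v = clamp(msg.linear.x, -V_MAX, V_MAX)
        # Limit the change of linear velocity per cycle (acceleration bound).
        dv = clamp(v - last_v, -A_MAX / RATE, A_MAX / RATE)
        last_v += dv
        out = Twist()
        out.linear.x = last_v
        out.angular.z = clamp(msg.angular.z, -W_MAX, W_MAX)
        pub.publish(out)

    rospy.init_node('velocity_filter_example')
    pub = rospy.Publisher('/cmd_vel_filtered', Twist, queue_size=1)
    rospy.Subscriber('/cmd_vel_raw', Twist, cmd_cb)
    rospy.spin()
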

 

Hardware driver

A dedicated driver for the hardware components of the robot. It gets data from the sensors module and the motor controller and publishes them to ROS topics. The functions of the driver are the following:

  • sending velocity commands from the /cmd_vel topic to the low-level motor controller,
  • getting odometry data, i.e. the relative position and orientation of the robot, from the low-level controller and publishing it on the /wheels_odom topic,
  • reading data from the sensors module (the inertial sensor, magnetometers, and proximity sensors) and publishing it to the /imu/raw_data and /imu_mag topics.

The component runs on the Raspberry Pi, and communication with the lower-level components is based on the I2C interface. It was written in C++ using the bcm2835 library for the Broadcom BCM2835 processor.
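
The actual driver is written in C++, but the data flow can be sketched in Python with the smbus package; the I2C address and the register layout below are purely hypothetical:

    #!/usr/bin/env python
    # Sketch of the driver's odometry path: read pose bytes over I2C
    # and publish them to /wheels_odom. Address and registers are made up.
    import math
    import struct
    import rospy
    import smbus
    from nav_msgs.msg import Odometry

    BUS_ID, CTRL_ADDR, ODOM_REG = 1, 0x20, 0x00  # hypothetical I2C parameters

    rospy.init_node('hardware_driver_example')
    pub = rospy.Publisher('/wheels_odom', Odometry, queue_size=10)
    bus = smbus.SMBus(BUS_ID)
    rate = rospy.Rate(20)
    while not rospy.is_shutdown():
        # Read 12 bytes: x [m], y [m], yaw [rad] as three little-endian floats.
        raw = bus.read_i2c_block_data(CTRL_ADDR, ODOM_REG, 12)
        x, y, yaw = struct.unpack('<fff', bytes(bytearray(raw)))
        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = 'odom'
        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        odom.pose.pose.orientation.z = math.sin(yaw / 2.0)  # yaw-only quaternion
        odom.pose.pose.orientation.w = math.cos(yaw / 2.0)
        pub.publish(odom)
        rate.sleep()
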

 

User application with rviz

This software allows the user to control the platform. It is based on the ROS visualization tool rviz. It displays a map of the robot's surroundings and the image from the Kinect RGB camera, and provides the ability to set autonomous navigation goals.
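
Under the hood, the rviz 2D Nav Goal tool publishes a geometry_msgs/PoseStamped message on the /move_base_simple/goal topic, so a goal can also be set from a script (the coordinates below are arbitrary):

    #!/usr/bin/env python
    # Send a navigation goal the same way rviz does.
    import rospy
    from geometry_msgs.msg import PoseStamped

    rospy.init_node('send_goal_example')
    pub = rospy.Publisher('/move_base_simple/goal', PoseStamped,
                          queue_size=1, latch=True)
    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = rospy.Time.now()
    goal.pose.position.x = 1.0     # goal 1 m ahead in the map frame
    goal.pose.orientation.w = 1.0  # keep the current heading
    pub.publish(goal)
    rospy.sleep(1.0)  # give the latched message time to be delivered
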

 

Localization filter

To improve the odometry-based localization, an inertial measurement unit (IMU) was used. The data fusion is based on an Extended Kalman Filter (EKF).
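
As a toy sketch of the fusion idea, reduced to a single state (the yaw angle): the filter predicts with the IMU gyro rate and corrects with the wheel-odometry heading. The real filter estimates the full pose, and the noise values here are illustrative:

    # One-dimensional Kalman filter over the yaw angle.
    class YawKalmanFilter(object):
        def __init__(self, q=0.01, r=0.1):
            self.yaw = 0.0  # state estimate [rad]
            self.p = 1.0    # estimate variance
            self.q = q      # process noise (gyro integration)
            self.r = r      # measurement noise (odometry heading)

        def predict(self, gyro_rate, dt):
            self.yaw += gyro_rate * dt  # integrate the angular rate
            self.p += self.q * dt

        def update(self, odom_yaw):
            k = self.p / (self.p + self.r)         # Kalman gain
            self.yaw += k * (odom_yaw - self.yaw)  # correct with measurement
            self.p *= (1.0 - k)
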

 

Kinect OpenNI

The OpenNI driver for the Kinect. It allows getting data such as the depth image or the RGB image from the Kinect sensor. The data are published to specific topics.
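
A minimal subscriber for that data, assuming the usual openni_launch topic names:

    #!/usr/bin/env python
    # Read the Kinect depth image published by the OpenNI driver.
    import rospy
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    bridge = CvBridge()

    def depth_cb(msg):
        # Convert the ROS image to a NumPy array (units depend on encoding).
        depth = bridge.imgmsg_to_cv2(msg)
        rospy.loginfo('depth at image center: %s',
                      depth[depth.shape[0] // 2, depth.shape[1] // 2])

    rospy.init_node('kinect_listener_example')
    rospy.Subscriber('/camera/depth/image_raw', Image, depth_cb)
    rospy.spin()
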

 

Laser scan from Kinect

The software converts a depth image from the Kinect sensor to a 2D laser scan message (sensor_msgs/LaserScan). Additionally, it removes the ground from the depth image and compensates for the tilt angle of the sensor. The component was published as part of the depth_nav_tools package on the ROS page.
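
The core of such a conversion (leaving out the ground removal and tilt compensation) can be sketched as follows; fx and cx are the focal length and optical center of the depth camera in pixels:

    # Convert one row of a depth image [m] to (angle, range) pairs.
    import math

    def depth_row_to_scan(depth_row, fx, cx):
        scan = []
        for u, z in enumerate(depth_row):
            if z <= 0.0:  # skip invalid depth readings
                continue
            theta = math.atan((u - cx) / fx)            # angle from optical axis
            scan.append((theta, z / math.cos(theta)))   # Euclidean range
        return scan
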

Navigation

The standard ROS navigation stack, reconfigured for this platform.

 

Tests of the autonomous navigation system

The pictures below show the graphical interface which visualizes the mobile robot and the map of its surroundings.

[Screenshots: rviz views of the robot and the maps built with the Kinect]

 

The conducted tests show that it is possible to build a map and navigate autonomously using only the Microsoft Kinect sensor. However, it should be noted that the Kinect has some drawbacks in this application, such as a narrow field of view and an insufficient range. Because of that, the robot sometimes has problems with localizing itself properly on the map.
