Effective Sensor Fusion of a Mobile Robot for SLAM Implementation

Toroslu I., Doğan M.

4th International Conference on Control, Automation and Robotics (ICCAR), Auckland, New Zealand, 20 - 23 April 2018, pp.76-81

  • Publication Type: Conference Paper / Full Text
  • City: Auckland
  • Country: New Zealand
  • Page Numbers: pp.76-81
  • Istanbul Technical University Affiliated: Yes


Simultaneous Localization and Mapping (SLAM) is the process by which a mobile robot builds a map of an environment with unknown topography. The purpose of this paper is to map an unknown environment with a mobile robot we designed, using the sensor fusion algorithms we established. The robot performs its mapping by combining ultrasonic, optical encoder and IMU sensors. Determining the positions of the obstacles and of the robot itself is the core of this study. Inertial and rotational sensors are used to calculate the robot's travelled distance and position. Owing to its low cost, an ultrasonic sensor is used instead of a Lidar, and results close to the real ones were obtained. The robot's heading and movement are controlled by an algorithm developed on the Raspberry Pi; this algorithm steers the wheels using information received from the optical encoder and the angle measurement. The data received from the gyroscope and the accelerometer are strongly affected by external factors such as vibration and noise, so a moving average filter and a complementary filter are applied to reduce the effects of noise and measurement error. Even so, the accelerometer still produces faulty results when distances are computed from it; the distance computation is therefore carried out with the optical encoder instead. The distance computation algorithm is written in the Python programming language. This study establishes that the comparative use of several sensors provides more accurate results. At the same time, the system is developed quite efficiently by using open software (Raspberry Pi, Linux, etc.) and writing custom libraries.
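The filtering stages named above can be sketched in Python as follows. This is a minimal illustration, not the paper's implementation: the window size, sampling interval and the 0.98/0.02 blend weight are assumptions chosen for the example.

```python
from collections import deque


class MovingAverage:
    """Moving average filter: smooths a noisy sensor stream by
    averaging the most recent `window` samples (window size assumed)."""

    def __init__(self, window=5):
        self.buf = deque(maxlen=window)

    def update(self, sample):
        self.buf.append(sample)
        return sum(self.buf) / len(self.buf)


def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Complementary filter: blends the integrated gyroscope rate
    (reliable short-term) with the accelerometer-derived angle
    (reliable long-term but vibration-sensitive). `alpha` is an
    assumed blend weight, not a value from the paper."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

In this arrangement the moving average suppresses high-frequency vibration spikes before the complementary filter fuses the two angle estimates, which matches the abstract's two-stage noise-reduction description.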
The robot's coordinate information is combined in a simulation environment using the Pygame library, by computing the coordinates of the robot's own location and of the objects it detects during navigation. The mobile robot executes its mapping process according to these derived data. The effects of the margins of error in the information obtained through the comparative use of the sensors are also studied within the scope of this work.
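The coordinate computation behind such a map can be sketched as dead reckoning from encoder ticks plus projection of ultrasonic ranges into world coordinates. The tick resolution, wheel radius and wheel base below are illustrative assumptions, not values from the paper, and the resulting points would then be drawn with Pygame.

```python
import math

# Assumed robot geometry and encoder resolution (not from the paper).
TICKS_PER_REV = 20      # encoder slots per wheel revolution
WHEEL_RADIUS = 0.03     # wheel radius in metres
WHEEL_BASE = 0.15       # distance between the two wheels in metres


def update_pose(x, y, theta, left_ticks, right_ticks):
    """Advance the pose (x, y, theta) using one sampling interval's
    encoder tick counts for a differential-drive robot."""
    circ = 2.0 * math.pi * WHEEL_RADIUS
    d_left = left_ticks / TICKS_PER_REV * circ
    d_right = right_ticks / TICKS_PER_REV * circ
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / WHEEL_BASE
    # Integrate along the mid-interval heading for less drift.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta


def obstacle_coords(x, y, theta, sonar_range):
    """World coordinates of an obstacle detected straight ahead by the
    ultrasonic sensor at distance `sonar_range` (metres)."""
    return (x + sonar_range * math.cos(theta),
            y + sonar_range * math.sin(theta))
```

Each `(x, y)` pose and each obstacle point can then be scaled to pixel coordinates and plotted in the Pygame window to produce the map.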