Tracking a Mobile Robot Position Using Vision and Inertial Sensor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

Wheeled mobile robots are still the first choice for industrial and domotic applications. The robot's navigation system aims to reliably determine the robot's position, velocity, and orientation and provide them to the control and trajectory-guidance modules. The most frequently used sensors are inertial measurement units (IMUs) combined with an absolute position-sensing mechanism. The dead-reckoning approach using an IMU suffers from integration drift due to noise and bias. To overcome this limitation, we propose using the inertial system in combination with mechanical odometers and a vision-based system. These two sensors complement each other: the vision sensor is accurate at low velocities but requires a long computation time, while the inertial sensor can track fast movements but suffers from drift. The information from the sensors is integrated through a multi-rate fusion scheme. Each sensor system is assumed to have its own independent sampling rate, which may be time-varying. Data fusion is performed by a multi-rate Kalman filter. The paper describes the inertial and vision navigation systems and the data fusion algorithm. Simulation and experimental results are presented.
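The multi-rate fusion the abstract describes can be illustrated with a minimal sketch: the filter predicts at the IMU's fast rate and applies an absolute-position correction whenever a (slower, possibly irregular) vision measurement arrives. This is only an illustration under a simple 1-D constant-velocity model with accelerometer-driven prediction; the state choice, noise parameters, and function names are assumptions for the example, not the paper's actual formulation.

```python
import numpy as np

def predict(x, P, a_meas, dt, q):
    """IMU-rate prediction step for state x = [position, velocity].

    The accelerometer reading a_meas drives the motion model; q scales
    the process noise injected through the acceleration channel.
    """
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])          # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])     # how acceleration enters the state
    x = F @ x + B * a_meas
    P = F @ P @ F.T + q * np.outer(B, B)
    return x, P

def update_position(x, P, z, r):
    """Vision-rate correction: z is an absolute position fix, r its variance."""
    H = np.array([[1.0, 0.0]])          # vision observes position only
    S = H @ P @ H.T + r                 # innovation covariance
    K = P @ H.T / S                     # Kalman gain
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Because prediction and update are separate functions, each sensor can run at its own, even time-varying, rate: `predict` is called with whatever `dt` elapsed since the last IMU sample, and `update_position` only when a vision fix is available, which is the essence of the multi-rate scheme.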
Original language: English
Title of host publication: IFIP Advances in Information and Communication Technology
Pages: 201-208
Volume: 423
ISBN (Electronic): 978-3-642-54734-8
DOIs
Publication status: Published - 2014
Event: 5th IFIP WG 5.5/SOCOLNET Doctoral Conference on Computing, Electrical and Industrial Systems (DoCEIS)
Duration: 1 Jan 2014 → …


Keywords

  • inertial sensor
  • mobile robotics
  • multi-rate sampling
  • sensor fusion
  • vision

