Abstract
This paper presents a ground vehicle capable of exploiting haptic cues to learn navigation affordances from depth cues. A simple pan-tilt telescopic antenna and a Kinect sensor, both fitted to the robot's body frame, provide the required haptic and depth sensory feedback, respectively. With the antenna, the robot determines whether an object is traversable. The interaction outcome is then associated with the object's depth-based descriptor. Later on, the robot uses this acquired knowledge to predict whether a newly observed object is traversable simply by inspecting its depth-based appearance. A set of field trials shows the robot's ability to progressively learn which elements of the environment are traversable.
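A minimal sketch of the self-supervised loop described above, not the authors' implementation: the haptic probe supplies the traversability label, which is paired with a depth-based descriptor and later used to classify new objects from depth alone. The `depth_descriptor` function (a plain depth histogram) and the k-NN classifier are assumptions made only to keep the example concrete and runnable.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def depth_descriptor(depth_patch, bins=16):
    """Hypothetical descriptor: normalized histogram of a Kinect depth patch (meters)."""
    hist, _ = np.histogram(depth_patch, bins=bins, range=(0.0, 4.0), density=True)
    return hist


class AffordanceLearner:
    """Associates haptic probing outcomes with depth-based descriptors."""

    def __init__(self):
        self.descriptors = []  # depth appearance of each probed object
        self.labels = []       # haptic outcome: 1 = traversable, 0 = blocked

    def learn_from_interaction(self, depth_patch, antenna_says_traversable):
        """Self-supervised update: the antenna interaction provides the label."""
        self.descriptors.append(depth_descriptor(depth_patch))
        self.labels.append(1 if antenna_says_traversable else 0)

    def predict_traversable(self, depth_patch, k=3):
        """Predict traversability of a new object from its depth appearance alone."""
        clf = KNeighborsClassifier(n_neighbors=min(k, len(self.labels)))
        clf.fit(np.array(self.descriptors), np.array(self.labels))
        return bool(clf.predict([depth_descriptor(depth_patch)])[0])
```

In this sketch the classifier is refit on each query for simplicity; any incremental or batch learner over the same (descriptor, label) pairs would serve the same role.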
Original language | English |
---|---|
Title of host publication | 2014 IEEE International Conference on Autonomous Robot Systems and Competitions, ICARSC |
Editors | N Lau, AP Moreira, R Ventura, BM Faria |
Publisher | Institute of Electrical and Electronics Engineers (IEEE) |
Pages | 146-151 |
Number of pages | 6 |
DOIs | |
Publication status | Published - 2014 |
Event | IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC) - Espinho, Portugal. Duration: 14 May 2014 → 15 May 2014
Conference
Conference | IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC) |
---|---|
Country/Territory | Portugal |
City | Espinho |
Period | 14/05/14 → 15/05/14 |
Keywords
- affordances
- autonomous robots
- depth sensing
- robotic antenna
- self-supervised learning
- terrain assessment