Terrain classification using W-K filter and 3D navigation with static collision avoidance

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution (peer-reviewed)

9 Citations (Scopus)

Abstract

The ability to autonomously navigate in an unknown and dynamic environment, avoiding obstacles while simultaneously classifying various types of terrain, is a challenge that has mainly been addressed by researchers in the computer vision field. Solutions to these problems are of great interest for collaborative autonomous navigation robots. For example, an Unmanned Aerial Vehicle (UAV) may be used to determine the path that an Unmanned Surface Vehicle (USV) has to follow to reach its intended destination. This paper presents a novel vision-based algorithm that allows for independent navigation with in-flight obstacle-avoidance planning while simultaneously classifying different terrain types using a Wiener-Khinchin (W-K) filter.
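The paper's implementation is not reproduced on this page, but the spectral idea behind a Wiener-Khinchin texture filter can be sketched. By the Wiener-Khinchin relation, the power spectral density of a signal is the Fourier transform of its autocorrelation, so a terrain patch's frequency-energy distribution can serve as a texture signature. The sketch below is a minimal illustration under that assumption; the helper names (wk_power_spectrum, radial_profile, classify), the radial energy profile feature, and the nearest-template classifier are illustrative choices, not the authors' algorithm.

```python
import numpy as np

def wk_power_spectrum(patch: np.ndarray) -> np.ndarray:
    # Wiener-Khinchin relation: the power spectral density (PSD) is the
    # Fourier transform of the autocorrelation, which for a finite
    # patch reduces to |FFT(patch)|^2.
    patch = patch - patch.mean()          # drop the DC component
    psd = np.abs(np.fft.fft2(patch)) ** 2
    return np.fft.fftshift(psd)           # move zero frequency to the centre

def radial_profile(psd: np.ndarray, n_bins: int = 16) -> np.ndarray:
    # Collapse the 2-D PSD into a rotation-invariant radial energy
    # profile; different terrain textures (e.g. grass vs. asphalt)
    # concentrate energy at different spatial frequencies.
    h, w = psd.shape
    y, x = np.indices(psd.shape)
    r = np.hypot(y - h / 2.0, x - w / 2.0)
    edges = np.linspace(0.0, r.max() + 1e-9, n_bins + 1)
    feats = np.array([psd[(r >= lo) & (r < hi)].mean()
                      for lo, hi in zip(edges[:-1], edges[1:])])
    return feats / (feats.sum() + 1e-12)  # normalise to unit energy

def classify(patch: np.ndarray, templates: dict) -> str:
    # Nearest-template classification against precomputed class
    # profiles, e.g. {"grass": ..., "gravel": ..., "asphalt": ...}.
    f = radial_profile(wk_power_spectrum(patch))
    return min(templates, key=lambda name: np.linalg.norm(f - templates[name]))
```

The keywords below suggest the navigation and planning side is built on ROS and MoveIt!; that part is independent of the texture filter and is not sketched here.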

Original language: English
Title of host publication: Intelligent Systems and Applications - Proceedings of the 2019 Intelligent Systems Conference (IntelliSys) Volume 2
Editors: Yaxin Bi, Rahul Bhatia, Supriya Kapoor
Place of publication: Cham
Publisher: Springer
Pages: 1122-1137
Number of pages: 16
ISBN (Electronic): 978-3-030-29513-4
ISBN (Print): 978-3-030-29512-7
Publication status: Published - 2020
Event: Intelligent Systems Conference, IntelliSys 2019 - London, United Kingdom
Duration: 5 Sept 2019 - 6 Sept 2019

Publication series

Name: Advances in Intelligent Systems and Computing
Publisher: Springer
Volume: 1038
ISSN (Print): 2194-5357
ISSN (Electronic): 2194-5365

Conference

Conference: Intelligent Systems Conference, IntelliSys 2019
Country/Territory: United Kingdom
City: London
Period: 5/09/19 - 6/09/19

Keywords

  • Classification
  • MoveIt!
  • Navigation
  • Obstacle avoidance
  • Planning
  • QGroundControl
  • ROS
  • UAV
  • W-K Filter
