FPGA-based satellite image classification for water bodies detection

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)


Land Use/Land Cover classification algorithms have been extensively studied and implemented on Central Processing Unit (CPU) and Graphics Processing Unit (GPU) based platforms. In this work we present a detailed study of Land Use/Land Cover classification performance, in terms of accuracy and computational speed, on a Field-Programmable Gate Array (FPGA). Two classification algorithms, Decision Tree and Minimum Distance, are studied to distinguish two categories (i.e. water or no-water). Both algorithms are implemented on FPGA and CPU to confirm the advantages of a parallel approach. Due to the pre-processing techniques used, the FPGA and CPU implementations share the same accuracy results, differing only in processing time. The results showed 98.97% accuracy for the Decision Tree, and a speed-up factor of 4 of the FPGA over the CPU for the Minimum Distance classifier. The main goal of this case study is to generate maps that help firefighters locate water areas to refill water tanks during wildfires. The final results show that the output of the classifier identifies water resources better than the ground-truth Land Use/Land Cover map (COS) provided by Direção Geral do Território (DGT), Portugal.
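The Minimum Distance classifier mentioned in the abstract assigns each pixel to the class whose mean spectral vector is nearest in Euclidean distance. A minimal sketch of that idea, with hypothetical per-band class means (the paper's actual bands, means, and pre-processing are not given here and would be estimated from training pixels):

```python
import math

# Hypothetical per-band mean reflectances for the two categories
# (illustrative values only; not taken from the paper).
CLASS_MEANS = {
    "water":    [0.05, 0.08, 0.04],
    "no-water": [0.20, 0.25, 0.30],
}

def classify_pixel(pixel):
    """Minimum Distance classifier: assign the pixel's spectral
    vector to the class with the nearest mean vector."""
    def dist(mean):
        # Euclidean distance between pixel and class mean
        return math.sqrt(sum((p - m) ** 2 for p, m in zip(pixel, mean)))
    return min(CLASS_MEANS, key=lambda c: dist(CLASS_MEANS[c]))

# Example: a dark, low-reflectance pixel falls nearest the water mean.
label = classify_pixel([0.06, 0.07, 0.05])
```

Because the per-pixel computation is independent and fixed-structure, it maps naturally onto an FPGA pipeline, which is the source of the speed-up the paper reports over the sequential CPU version.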

Original language: English
Title of host publication: Proceedings - 2020 International Young Engineers Forum, YEF-ECE 2020
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Number of pages: 5
ISBN (Electronic): 9781728156781
Publication status: Published - Jul 2020
Event: 2020 International Young Engineers Forum, YEF-ECE 2020 - Online, Caparica, Portugal
Duration: 3 Jul 2020 → 3 Jul 2020


Conference: 2020 International Young Engineers Forum, YEF-ECE 2020


Keywords
  • CPU
  • FPGA
  • GPU
  • Land Use/Land Cover Classifier


