Dynamic Fusion Module Evolves Drivable Area and Road Anomaly Detection: A Benchmark and Algorithms

Hengli Wang, Rui Fan, Yuxiang Sun, Ming Liu*

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

Abstract

Joint detection of drivable areas and road anomalies is crucial for mobile robots. Recently, many semantic segmentation approaches based on convolutional neural networks (CNNs) have been proposed for pixelwise drivable area and road anomaly detection. In addition, some benchmark datasets, such as KITTI and Cityscapes, have been widely used. However, the existing benchmarks are mostly designed for self-driving cars; there is no comparable benchmark for ground mobile robots, such as robotic wheelchairs. Therefore, in this article, we first build a drivable area and road anomaly detection benchmark for ground mobile robots, evaluating existing state-of-the-art (SOTA) single-modal and data-fusion semantic segmentation CNNs using six modalities of visual features. Furthermore, we propose a novel module, referred to as the dynamic fusion module (DFM), which can be easily deployed in existing data-fusion networks to fuse different types of visual features effectively and efficiently. The experimental results show that the transformed disparity image is the most informative visual feature, and that our proposed DFM-RTFNet outperforms the SOTA networks. In addition, DFM-RTFNet achieves competitive performance on the KITTI road benchmark.
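The abstract does not spell out the DFM architecture, so the following is only a minimal sketch of the general idea it names: fusing two visual-feature modalities (e.g., RGB features and transformed-disparity features) with input-conditioned, learned weights rather than a fixed sum. The module name `DynamicFusionSketch`, the gating branch, and all shapes below are assumptions for illustration, not the authors' DFM.

```python
import torch
import torch.nn as nn


class DynamicFusionSketch(nn.Module):
    """Hypothetical sketch of input-conditioned two-modality fusion.

    NOTE: this is NOT the paper's DFM. It only illustrates the idea of
    predicting, per input, how strongly to weight each modality's
    feature map before combining them.
    """

    def __init__(self, channels: int):
        super().__init__()
        # Gating branch: globally pool both modalities, then predict two
        # per-channel weights that are normalized with a softmax.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, 2 * channels, kernel_size=1),
        )

    def forward(self, rgb_feat: torch.Tensor, disp_feat: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = rgb_feat.shape
        logits = self.gate(torch.cat([rgb_feat, disp_feat], dim=1))
        # Softmax over the two modalities -> convex combination per channel.
        weights = torch.softmax(logits.view(b, 2, c, 1, 1), dim=1)
        return weights[:, 0] * rgb_feat + weights[:, 1] * disp_feat


if __name__ == "__main__":
    # Fuse RGB and transformed-disparity encoder features at one stage.
    fuse = DynamicFusionSketch(channels=64)
    rgb = torch.randn(2, 64, 80, 80)
    disp = torch.randn(2, 64, 80, 80)
    print(fuse(rgb, disp).shape)  # torch.Size([2, 64, 80, 80])
```

Because the fusion weights are computed from the inputs themselves, such a module can be dropped into a data-fusion backbone (the paper pairs its DFM with RTFNet, yielding DFM-RTFNet) at one or more encoder stages without changing the surrounding architecture.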

Original language: English
Pages (from-to): 10750-10760
Number of pages: 11
Journal: IEEE Transactions on Cybernetics
Volume: 52
Issue number: 10
DOIs
Publication status: Published - 1 Oct 2022

Bibliographical note

Publisher Copyright:
© 2013 IEEE.

Keywords

  • Deep learning in robotics and automation
  • dynamic fusion
  • mobile robots
  • semantic scene understanding
