Guidance: A visual sensing platform for robotic applications

Guyue Zhou, Lu Fang, Ketan Tang, Honghui Zhang, Kai Wang, Kang Yang

Research output: Chapter in Book/Conference Proceeding/Report › Conference Paper published in a book › peer-review

34 Citations (Scopus)

Abstract

Visual sensing tasks, such as vision-based localization, navigation, and tracking, are crucial for intelligent robots and have shown great advantages in many robotic applications. However, the market still lacks a powerful visual sensing platform that can handle most visual processing tasks. In this paper we introduce a powerful and efficient platform, Guidance, which is composed of one processor and multiple (up to five) stereo sensing units. Basic visual tasks, including visual odometry, obstacle avoidance, and depth generation, are provided as built-in functions. Additionally, with the aid of a well-documented SDK, Guidance is extremely flexible, allowing users to develop other applications such as autonomous navigation, SLAM, and tracking.
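The depth generation the abstract lists as a built-in function follows the standard stereo triangulation relation Z = f·B/d for a rectified stereo pair. A minimal sketch of that relation (the focal length and baseline values below are illustrative placeholders, not Guidance's actual calibration):

```python
# Hypothetical sketch of depth-from-disparity for a rectified stereo pair.
# FOCAL_PX and BASELINE_M are assumed example values, NOT the calibration
# of a real Guidance stereo unit.

FOCAL_PX = 240.0    # assumed focal length, in pixels
BASELINE_M = 0.15   # assumed stereo baseline, in metres

def depth_from_disparity(disparity_px: float) -> float:
    """Depth Z = f * B / d; zero disparity means the point is at infinity."""
    if disparity_px <= 0:
        return float("inf")
    return FOCAL_PX * BASELINE_M / disparity_px

# e.g. a 12-pixel disparity maps to 240 * 0.15 / 12 = 3.0 m
```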

Original language: English
Title of host publication: 2015 IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2015
Publisher: IEEE Computer Society
Pages: 9-14
Number of pages: 6
ISBN (Electronic): 9781467367592
DOIs
Publication status: Published - 19 Oct 2015
Externally published: Yes
Event: IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2015 - Boston, United States
Duration: 7 Jun 2015 - 12 Jun 2015

Publication series

Name: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Volume: 2015-October
ISSN (Print): 2160-7508
ISSN (Electronic): 2160-7516

Conference

Conference: IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2015
Country/Territory: United States
City: Boston
Period: 7/06/15 - 12/06/15

Bibliographical note

Publisher Copyright:
© 2015 IEEE.

Keywords

  • Cameras
  • Feature extraction
  • Simultaneous localization and mapping
  • Visualization
