Real-Time Globally Consistent Dense 3D Reconstruction with Online Texturing

Lei Han, Siyuan Gu, Dawei Zhong, Shuxue Quan, Lu Fang*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

8 Citations (Scopus)

Abstract

High-quality reconstruction of 3D geometry and texture plays a vital role in providing immersive perception of the real world. Additionally, online computation enables the practical use of 3D reconstruction for interaction. We present an RGBD-based globally consistent dense 3D reconstruction approach, where high-quality texture patches (i.e., at the spatial resolution of the RGB image) are mapped onto high-resolution (≤ 1 cm) geometric models online. The whole pipeline relies solely on the CPU of a portable device. For real-time geometric reconstruction with online texturing, we propose to solve the texture optimization problem with a simplified incremental MRF solver within the geometric reconstruction pipeline, using a sparse voxel sampling strategy. An efficient reference-based color adjustment scheme is also proposed to achieve consistent texture patch colors under inconsistent luminance conditions. Quantitative and qualitative experiments demonstrate that our online scheme achieves a realistic visualization of the environment with more abundant detail, while consuming fairly compact memory and incurring much lower computational complexity than existing solutions.

Original language: English
Pages (from-to): 1519-1533
Number of pages: 15
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 44
Issue number: 3
DOIs
Publication status: Published - 1 Mar 2022
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 1979-2012 IEEE.

Keywords

  • CPU computing
  • Real-time 3D reconstruction
  • SLAM
  • TSDF fusion
  • global consistency
  • online texturing
