

Journal of Information Science and Engineering, Vol. 31, No. 4, pp. 1309-1327


A Unified Probabilistic Framework for Real-Time Depth Map Fusion


YONG DUAN, MINGTAO PEI, YUCHENG WANG, MIN YANG, XIAMENG QIN AND YUNDE JIA 
Beijing Lab of Intelligent Information, School of Computer Science 
Beijing Institute of Technology 
Beijing, 100081 P.R. China 
E-mail: {duanyong; peimt; jiayunde}@bit.edu.cn


    This paper proposes a unified probabilistic framework for real-time depth map fusion. By modeling the depth imaging process as a random experiment, depth map fusion is converted into probability density function (pdf) estimation. The fusion problem is decoupled into four parts: the fusion space, the influence term, the visibility term, and the confidence term. We combine these four terms in a unified probabilistic framework and apply the framework to two cases to evaluate its performance. In the first case, multiple stereo vision cameras acquire depth map streams from different viewpoints simultaneously in real time. In the second case, two cameras and a Kinect are combined to provide two depth map streams. In both cases, strategies for each part of the framework are presented to perform real-time depth fusion. Experimental results show that the proposed framework is promising for real-time depth map fusion.
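    As a rough illustration only (the notation below is ours, not taken from the paper), a decomposition of this kind can be sketched as a per-pixel depth pdf assembled from the three weighting terms over the fusion space:

    \[
    p(d \mid \mathcal{D}) \;\propto\; \sum_{i=1}^{N} c_i \, v_i(d)\, \phi_i\!\left(d;\, d_i\right), \qquad d \in \Omega,
    \]

    where \(\Omega\) denotes the fusion space, \(\phi_i\) the influence term contributed by the \(i\)-th depth measurement \(d_i\), \(v_i\) its visibility term, and \(c_i\) its confidence; the fused depth would then be taken as the mode (or expectation) of this pdf. How the paper actually defines and combines the four terms is given in the full text.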


Keywords: stereo vision, real-time multi-view stereo, depth map fusion
