Depth Enhancement by Fusion for Passive and Active Sensing*

Frederic Garcia1, Djamila Aouada1, Hashim Kemal Abdella1,2, Thomas Solignac3, Bruno Mirbach3, and Björn Ottersten1

1 Interdisciplinary Centre for Security, Reliability and Trust, University of Luxembourg, Luxembourg
2 Université de Bourgogne, France
3 Advanced Engineering - IEE S.A., Luxembourg
Abstract. This paper presents a general refinement procedure that enhances any given depth map obtained by passive or active sensing. Given a depth map, either estimated by triangulation methods or directly provided by the sensing system, and its corresponding 2-D image, we correct the depth values by separately treating regions with undesired effects such as empty holes, texture copying, or edge blurring due to homogeneous regions, occlusions, and shadowing. In this work, we use recent depth enhancement filters intended for Time-of-Flight cameras and adapt them to alternative depth sensing modalities, both active, using an RGB-D camera, and passive, using a dense stereo camera. To that end, we propose specific masks to tackle areas in the scene that require special treatment. Our experimental results show that such areas are satisfactorily handled by replacing erroneous depth measurements with accurate ones.

Keywords: depth enhancement, data fusion, passive sensing, active sensing

* This work was supported by the National Research Fund, Luxembourg, under the CORE project C11/BM/1204105/FAVE/Ottersten.
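To make the fusion idea described in the abstract concrete, the sketch below illustrates one possible mask-guided correction step: a joint (cross) bilateral filter evaluated only at pixels flagged as unreliable, which replaces their depth values by a weighted average of reliable neighbours, with weights driven by the co-registered 2-D guidance image. This is a generic illustration of guided depth refinement under assumed inputs, not the specific filters or masks proposed in the paper; the function name `joint_bilateral_fill` and its parameters (`radius`, `sigma_s`, `sigma_r`) are hypothetical.

```python
import numpy as np

def joint_bilateral_fill(depth, guide, mask, radius=5, sigma_s=3.0, sigma_r=0.1):
    """Illustrative sketch (not the paper's method): replace depth values
    flagged by `mask` (True = unreliable) with a joint bilateral average of
    reliable neighbours, guided by the co-registered 2-D intensity image."""
    h, w = depth.shape
    out = depth.astype(np.float64).copy()
    guide = guide.astype(np.float64)
    ys, xs = np.nonzero(mask)
    for y, x in zip(ys, xs):
        # Local window around the unreliable pixel.
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        d_win = depth[y0:y1, x0:x1]
        g_win = guide[y0:y1, x0:x1]
        m_win = mask[y0:y1, x0:x1]
        yy, xx = np.mgrid[y0:y1, x0:x1]
        # Spatial weight (proximity) and range weight (guidance similarity).
        w_s = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2.0 * sigma_s ** 2))
        w_r = np.exp(-((g_win - guide[y, x]) ** 2) / (2.0 * sigma_r ** 2))
        weights = w_s * w_r * (~m_win)  # only reliable neighbours contribute
        total = weights.sum()
        if total > 0:
            out[y, x] = (weights * d_win).sum() / total
    return out

if __name__ == "__main__":
    # Toy example: a depth step edge with a hole of unreliable pixels.
    depth = np.ones((40, 40)); depth[:, 20:] = 2.0
    guide = depth / depth.max()          # guidance image aligned with depth
    mask = np.zeros_like(depth, dtype=bool)
    mask[15:25, 15:25] = True            # region to be corrected
    depth[mask] = 0.0                    # simulate missing/erroneous depth
    refined = joint_bilateral_fill(depth, guide, mask)
    print(refined[20, 14:26].round(2))   # hole filled, edge preserved
```

In this sketch the mask plays the role described in the abstract: it restricts the correction to problem areas (holes, occlusions, shadowed or homogeneous regions) while leaving already accurate measurements untouched, and the guidance term prevents depth values from bleeding across intensity edges.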