Depth Image Super Resolution for 3D Reconstruction of Oil Refinery Buildings


Abstract

A Time-of-Flight (ToF) camera can collect depth data of a dynamic scene surface in real time, and it has been applied to the 3D reconstruction of refinery buildings. However, due to the limitations of the sensor hardware, the resolution of the obtained depth image is very low and cannot meet the dense-depth requirements of practical applications such as 3D reconstruction. It is therefore necessary to make a breakthrough on the software side and design a good algorithm to improve the resolution of the depth image. We propose a depth image super-resolution algorithm based on the fusion of multiple progressive convolutional neural networks, which uses a context-based network fusion framework to fuse several different progressive networks, thereby improving the performance and efficiency of the individual networks while keeping network training simple. Finally, we have carried out experiments on a public data set, and the results show that the proposed algorithm matches or even exceeds the current state-of-the-art algorithms.
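
The following is a minimal sketch, in PyTorch, of the kind of architecture the abstract describes: several progressive (stage-wise x2 upsampling) CNN branches process the low-resolution depth map, and their outputs are blended by a small learned fusion layer standing in for the paper's context-based network fusion. All class names, layer counts, and hyperparameters here (ProgressiveBranch, FusedSRNet, stages, channels) are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only: the branch and fusion designs are assumptions, not the paper's code.
import torch
import torch.nn as nn

class ProgressiveBranch(nn.Module):
    # One progressive sub-network: refines and upsamples the depth map in successive x2 stages.
    def __init__(self, stages=2, channels=64):
        super().__init__()
        blocks = []
        for i in range(stages):
            blocks += [
                nn.Conv2d(1 if i == 0 else channels, channels, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.ConvTranspose2d(channels, channels, 4, stride=2, padding=1),  # x2 upsampling
                nn.ReLU(inplace=True),
            ]
        self.body = nn.Sequential(*blocks)
        self.to_depth = nn.Conv2d(channels, 1, 3, padding=1)

    def forward(self, lr_depth):
        return self.to_depth(self.body(lr_depth))

class FusedSRNet(nn.Module):
    # Fuses the outputs of several branches with a learned 1x1 convolution;
    # this is a simplified stand-in for the context-based network fusion.
    def __init__(self, num_branches=3, stages=2):
        super().__init__()
        self.branches = nn.ModuleList(
            [ProgressiveBranch(stages) for _ in range(num_branches)])
        self.fuse = nn.Conv2d(num_branches, 1, kernel_size=1)

    def forward(self, lr_depth):
        outs = [branch(lr_depth) for branch in self.branches]  # each: (N, 1, 4H, 4W) for stages=2
        return self.fuse(torch.cat(outs, dim=1))               # per-pixel weighted blend

if __name__ == "__main__":
    model = FusedSRNet(num_branches=3, stages=2)   # two x2 stages -> x4 super-resolution
    lr = torch.randn(1, 1, 64, 64)                 # simulated low-resolution ToF depth map
    print(model(lr).shape)                         # torch.Size([1, 1, 256, 256])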

About the authors

Shuaihao Li

School of Computer Science, Wuhan University

Email: chenmin@whu.edu.cn
People's Republic of China, Wuhan

Bin Zhang

School of Remote Sensing and Information Engineering, Wuhan University; Department of Computer Science, City University of Hong Kong

Email: chenmin@whu.edu.cn
People's Republic of China, Wuhan; Kowloon Tong

Xinfeng Yang

School of Computer Science, Wuhan University

Email: chenmin@whu.edu.cn
People's Republic of China, Wuhan

Yanxiang He

School of Computer Science, Wuhan University

Email: chenmin@whu.edu.cn
People's Republic of China, Wuhan

Min Chen

School of Computer Science, Wuhan University

Corresponding author
Email: chenmin@whu.edu.cn
People's Republic of China, Wuhan


Copyright © Springer Science+Business Media, LLC, part of Springer Nature, 2019
