In this paper we present a novel method for efficient and effective 3D surface reconstruction in open scenes. Existing works based on Neural Radiance Fields (NeRF) typically require extensive training and rendering time due to their implicit representations. In contrast, 3D Gaussian Splatting (3DGS) uses an explicit, discrete representation, so the reconstructed surface is built from a huge number of Gaussian primitives, which leads to excessive memory consumption and rough surface details in regions where Gaussians are sparse. To address these issues, we propose Gaussian Voxel Kernel Functions (GVKF), which establish a continuous scene representation on top of discrete 3DGS through kernel regression. GVKF combines fast 3DGS rasterization with a highly effective implicit scene representation, achieving high-fidelity surface reconstruction in open scenes. Experiments on challenging scene datasets demonstrate the efficiency and effectiveness of our proposed GVKF, featuring high reconstruction quality, real-time rendering speed, and significant savings in storage and training memory consumption.
* Note: The completeness of the background sky modeling stems from GOF's Marching Tetrahedra (MT) algorithm.
Framework of Gaussian Voxel Kernel Functions (GVKF) for scene representation. In this framework, discrete Gaussian primitives $\mathcal{G}$ represent a continuous opacity density $\rho(t)$ along the ray via kernel regression. After slightly modifying the rasterization pipeline, the kernel function can be integrated into alpha-blending rasterization without introducing dense point sampling. Additionally, we directly define the mapping between the neural opacity field and the implicit surface.
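The two ingredients described above — a continuous opacity density obtained by kernel regression over discrete Gaussians along a ray, and front-to-back alpha blending — can be illustrated with a minimal sketch. This is a simplified 1D illustration under assumed function names and parameterizations (isotropic 1D Gaussian kernels with per-primitive weights), not the released GVKF implementation:

```python
import numpy as np

def gaussian_kernel(t, mu, sigma):
    """1D Gaussian kernel evaluated at ray depth t."""
    return np.exp(-0.5 * ((t - mu) / sigma) ** 2)

def opacity_density(t, mus, sigmas, weights):
    """Continuous opacity density rho(t) along the ray:
    a kernel-regression sum over the Gaussian primitives
    the ray intersects (illustrative parameterization)."""
    return sum(w * gaussian_kernel(t, m, s)
               for m, s, w in zip(mus, sigmas, weights))

def alpha_blend(alphas, colors):
    """Standard front-to-back alpha blending of per-primitive
    contributions, as used in 3DGS-style rasterization."""
    transmittance = 1.0
    out = np.zeros(3)
    for a, c in zip(alphas, colors):
        out += transmittance * a * np.asarray(c, dtype=float)
        transmittance *= (1.0 - a)
    return out

# Continuous density from three discrete Gaussians on a ray.
rho = opacity_density(1.0, mus=[1.0, 2.0, 3.0],
                      sigmas=[0.1, 0.2, 0.3],
                      weights=[0.5, 0.3, 0.2])

# Blend two primitives front to back.
color = alpha_blend([0.5, 0.5], [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
```

At `t = 1.0` the density is dominated by the first Gaussian (centered there), and the blended color is half the front primitive's red plus a quarter of the rear primitive's blue, since the remaining transmittance after the first primitive is 0.5.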
@misc{song2024gvkfgaussianvoxelkernel,
      title={GVKF: Gaussian Voxel Kernel Functions for Highly Efficient Surface Reconstruction in Open Scenes},
      author={Gaochao Song and Chong Cheng and Hao Wang},
      year={2024},
      eprint={2411.01853},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2411.01853},
}