This repo contains the official implementation for the paper "SDLKF: Signed Distance Linear Kernel Function for Surface Reconstruction". Our work represents a scene with a set of linear SDF kernel functions, which admit an analytic solution for volume rendering instead of a numeric approximation.
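For intuition about the analytic rendering, the sketch below contrasts a closed-form segment opacity for a density that varies linearly along the ray with the usual sampled approximation. This is only an illustrative toy, not the paper's actual formulation (which derives the density from the signed distance through the linear kernel); all function names and values are made up.

```python
import numpy as np

def alpha_closed_form(sigma0, sigma1, dt):
    # If the density varies linearly over a segment of length dt, its integral
    # is exactly the trapezoid 0.5 * (sigma0 + sigma1) * dt, so the segment
    # opacity has a closed form and needs no sampling.
    return 1.0 - np.exp(-0.5 * (sigma0 + sigma1) * dt)

def alpha_numeric(sigma0, sigma1, dt, n_samples=4):
    # Standard NeRF-style rendering instead treats the density as piecewise
    # constant between samples, which only approximates the same integral.
    ts = np.linspace(0.0, dt, n_samples, endpoint=False)
    sigmas = sigma0 + (sigma1 - sigma0) * ts / dt
    return 1.0 - np.exp(-np.sum(sigmas * (dt / n_samples)))

print(alpha_closed_form(0.2, 1.5, 0.1))  # exact value for a linear density
print(alpha_numeric(0.2, 1.5, 0.1, 4))   # quadrature approximation
```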
```bash
# download
git clone https://github.com/otakuxiang/GUDF.git --recursive

# if you have an environment used for 3dgs, use it
conda env create --file environment.yml
conda activate surfel_splatting
```

Note that nvcc 11.8 is required before creating the conda env, and gcc-11/g++-11 are also needed to match nvcc.
To train a scene, simply use
```bash
python train.py -s <path to COLMAP or NeRF Synthetic dataset>
```

Commandline arguments for regularizations:

```bash
--multi_view_ncc_weight # hyperparameter for the multi-view NCC loss (see the sketch below)
--lambda_normal # hyperparameter for the normal regularization
```
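The multi-view NCC term scores photometric consistency between a reference patch and the corresponding patch from a neighboring view. Below is a minimal, hypothetical sketch of a normalized cross-correlation loss over already-extracted patches; the function name, patch shapes, and the warping/weighting used by the actual loss in this repo are assumptions.

```python
import torch

def ncc(patch_ref, patch_src, eps=1e-6):
    # patch_ref, patch_src: (N, P) tensors of N flattened grayscale patches
    # (hypothetical shapes). Returns per-patch normalized cross-correlation
    # in [-1, 1]; 1 - ncc can serve as a multi-view consistency loss.
    ref = patch_ref - patch_ref.mean(dim=1, keepdim=True)
    src = patch_src - patch_src.mean(dim=1, keepdim=True)
    num = (ref * src).sum(dim=1)
    den = torch.sqrt((ref * ref).sum(dim=1) * (src * src).sum(dim=1)) + eps
    return num / den

# Illustrative usage on random 7x7 patches:
loss = (1.0 - ncc(torch.rand(8, 49), torch.rand(8, 49))).mean()
print(loss)
```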
To export a mesh within a bounded volume, simply use

```bash
python render_pgsr.py -m <path to pre-trained model> -s <path to COLMAP dataset>
```

Commandline arguments you should adjust accordingly for meshing with bounded TSDF fusion:

```bash
--voxel_size # voxel size
```
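As noted in the acknowledgements, the mesh extraction relies on Open3D's TSDF fusion. The sketch below shows roughly what such a bounded fusion loop looks like; the file paths, depth scale, truncation, and the `frames` list are placeholders, and the actual code in render_pgsr.py may be organized differently.

```python
import open3d as o3d

# Placeholder inputs: rendered color/depth maps and camera parameters exported
# from the trained model.
frames = []  # fill with (color_path, depth_path, intrinsic, extrinsic) tuples

voxel_size = 0.01  # plays the role of the --voxel_size argument
volume = o3d.pipelines.integration.ScalableTSDFVolume(
    voxel_length=voxel_size,
    sdf_trunc=4 * voxel_size,
    color_type=o3d.pipelines.integration.TSDFVolumeColorType.RGB8,
)

for color_path, depth_path, intrinsic, extrinsic in frames:
    color = o3d.io.read_image(color_path)
    depth = o3d.io.read_image(depth_path)
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        color, depth, depth_scale=1000.0, depth_trunc=5.0,
        convert_rgb_to_intensity=False,
    )
    # intrinsic: o3d.camera.PinholeCameraIntrinsic, extrinsic: 4x4 world-to-camera.
    volume.integrate(rgbd, intrinsic, extrinsic)

mesh = volume.extract_triangle_mesh()
mesh.compute_vertex_normals()
o3d.io.write_triangle_mesh("fused_mesh.ply", mesh)
```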
If you have downloaded the DTU dataset, you can use

```bash
python train.py -s <path_to_scene> -m output/dtu/<scanid> -r 2 --depth_ratio 0. --multi_view_weight_from_iter 7000 --preload_img --multi_view_ncc_weight 0.5
python render_pgsr.py --iteration 20000 -m output/dtu/<scanid> --depth_ratio 0
```

We have provided a script for running the DTU dataset; please see the script.
We use the DTU data preprocessed by 2DGS.
For the Tanks and Temples dataset, please download the preprocessed data.
This project is built upon 2DGS. The TSDF fusion for extracting the mesh is based on Open3D. The rendering script for MipNeRF360 is adapted from Multinerf, while the evaluation scripts for the DTU and Tanks and Temples datasets are taken from DTUeval-python and TanksAndTemples, respectively. We thank all the authors for their great repos.
