DeepSurfels

This is the official implementation of the CVPR 2021 paper DeepSurfels: Learning Online Appearance Fusion.

DeepSurfels is a novel 3D representation for geometry and appearance information that combines planar surface primitives with a voxel grid representation for improved scalability and rendering quality.

If you find our code or paper useful, please consider citing:

@InProceedings{DeepSurfels:CVPR:21,
    title = {{DeepSurfels}: Learning Online Appearance Fusion},
    author = {Mihajlovic, Marko and Weder, Silvan and Pollefeys, Marc and Oswald, Martin R.},
    booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month = {June},
    year = {2021},
}

For questions, contact Marko Mihajlovic or open an issue / a pull request.

Installation

The prerequisite is to install the Python packages specified in the requirements.txt file, which can be conveniently
accomplished by using an Anaconda environment.

# clone the repo
git clone https://github.com/onlinereconstruction/deep_surfels.git
cd ./deep_surfels

# create environment
conda env create -f environment.yml
conda activate deep_surfels

Then install the deep_surfel package via pip:

pip install ./deep_surfel
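
To verify the installation, a minimal sanity check is to import the package and print its location (this assumes the installed module is named deep_surfel, matching the package directory):

python -c "import deep_surfel; print(deep_surfel.__file__)"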

Data

Directory ./data_prep/data_samples contains preprocessed toy data samples.
See ./data_prep/from_depth_frames.py for how to prepare your own dataset.
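
The exact arguments expected by that script are not listed here; assuming it uses a standard argument parser, its options can be inspected with:

python ./data_prep/from_depth_frames.py --help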

Usage

To run the deterministic fusion:

cd appearance_fusion
python test.py -c ../configurations/sample_deterministic.yml --extract_meshes

To train the learned module:

python train.py -c ../configurations/sample.yml

To evaluate the trained module:

python test.py -c ../configurations/sample.yml --extract_meshes

The rendered images will be stored in the directory specified by the logging_root_dir parameter.
See ./appearance_fusion/config.py for all available configuration parameters.
