Putting NeRF on a Diet

This project is an attempt to implement the paper Putting NeRF on a Diet (DietNeRF) in JAX/Flax. DietNeRF is designed to render high-quality novel views in a few-shot setting, a task that vanilla NeRF (Neural Radiance Field) struggles with. To achieve this, the authors introduce a Semantic Consistency Loss that supervises DietNeRF with prior knowledge from the CLIP Vision Transformer. This supervision lets DietNeRF learn 3D scene reconstruction using CLIP's prior knowledge of 2D views.
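At a high level, the semantic consistency loss compares CLIP embeddings of a rendered view and a ground-truth view from another pose. Below is a minimal sketch of that idea in JAX; it is illustrative only, and clip_image_encoder is a hypothetical callable standing in for the frozen CLIP vision encoder.

# Minimal sketch of the semantic consistency idea (not the repo's exact code).
# `clip_image_encoder` is a hypothetical callable mapping images (N, 224, 224, 3)
# to CLIP embeddings (N, D), e.g. a frozen CLIP vision transformer.
import jax.numpy as jnp

def semantic_consistency_loss(clip_image_encoder, rendered_view, target_view):
    """1 - cosine similarity between CLIP embeddings of two views."""
    emb = clip_image_encoder(jnp.stack([rendered_view, target_view]))  # (2, D)
    # Normalize so the dot product below is a cosine similarity.
    emb = emb / jnp.linalg.norm(emb, axis=-1, keepdims=True)
    return 1.0 - jnp.dot(emb[0], emb[1])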

? Demo

  1. You can check out our demo on Hugging Face Spaces
  2. Or you can run our Streamlit demo locally (model checkpoints will be fetched automatically on startup):
pip install -r requirements_demo.txt
streamlit run app.py

Streamlit Demo

✨ Implementation

Our code is written in JAX/Flax and is mainly based on jaxnerf from Google Research. The base code is highly optimized for GPU and TPU. For the semantic consistency loss, we use the pretrained CLIP Vision Transformer from the transformers library.
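For reference, the snippet below shows one way to obtain CLIP image features with the Flax API of transformers. The model name, file path, and preprocessing here are illustrative and may differ from the exact setup in our training code.

# Sketch: extract CLIP image features with transformers' Flax API (illustrative).
import jax.numpy as jnp
from transformers import FlaxCLIPModel, CLIPProcessor
from PIL import Image

clip = FlaxCLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("rendered_view.png")                    # example image path
inputs = processor(images=image, return_tensors="np")      # resize + normalize
features = clip.get_image_features(pixel_values=jnp.asarray(inputs["pixel_values"]))
features = features / jnp.linalg.norm(features, axis=-1, keepdims=True)  # unit norm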

To learn more about DietNeRF, our experiments, and our implementation, we highly recommend checking out our detailed Notion write-up!


? Hugging Face Model Hub Repo

You can also find our project and our model checkpoints in our Hugging Face Model Hub repository. The model checkpoints are located in the models folder.

Our JAX/Flax implementation currently supports:

| Platform   | Single-Host GPU |              | Multi-Device TPU |            |
|------------|-----------------|--------------|------------------|------------|
| Type       | Single-Device   | Multi-Device | Single-Host      | Multi-Host |
| Training   | Supported       | Supported    | Supported        | Supported  |
| Evaluation | Supported       | Supported    | Supported        | Supported  |
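The multi-device entries above rely on JAX's standard SPMD parallelism. The sketch below only illustrates that pattern (jax.pmap over all local devices); the real training step lives in the repo's training script, and train_step here is a stand-in.

# Illustration of multi-device training in JAX via pmap (not the repo's code).
import jax
import jax.numpy as jnp

def train_step(params, batch):
    # Stand-in for the real per-device NeRF/DietNeRF update.
    pred = batch["rays"] @ params["w"]                     # (1024,)
    loss = jnp.mean((pred - batch["pixels"]) ** 2)
    loss = jax.lax.pmean(loss, axis_name="batch")          # average across devices
    return params, loss

p_train_step = jax.pmap(train_step, axis_name="batch")

n_dev = jax.local_device_count()
params = jax.device_put_replicated({"w": jnp.zeros(8)}, jax.local_devices())
batch = {"rays": jnp.zeros((n_dev, 1024, 8)),              # one shard per device
         "pixels": jnp.zeros((n_dev, 1024))}
params, loss = p_train_step(params, batch)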

? Installation

# Clone the repo
git clone https://github.com/codestella/putting-nerf-on-a-diet
# Create a conda environment. Note that you can use Python 3.6-3.8, as
# one of the dependencies (TensorFlow) does not support Python 3.9 yet.
conda create --name jaxnerf python=3.6.12; conda activate jaxnerf
# Prepare pip
conda install pip; pip install --upgrade pip
# Install requirements
pip install -r requirements.txt
# [Optional] Install GPU and TPU support for JAX
# Remember to change cuda110 to your CUDA version, e.g. cuda101 for CUDA 10.1.
pip install --upgrade jax "jax[cuda110]" -f https://storage.googleapis.com/jax-releases/jax_releases.html
# Install Flax and the Flax version of the transformers library
pip install flax transformers[flax]

⚽ Dataset

Download the datasets from the NeRF official Google Drive.
Please download nerf_synthetic.zip and unzip it
wherever you like. In the examples below we assume it is placed under /tmp/jaxnerf/data/.
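To verify the download, the following sketch loads the training metadata for the lego scene, assuming the standard nerf_synthetic layout (the path is an example; adjust it to wherever you unzipped the data).

# Sanity-check the downloaded data (example path).
import json

scene_dir = "/tmp/jaxnerf/data/nerf_synthetic/lego"
with open(f"{scene_dir}/transforms_train.json") as f:
    meta = json.load(f)

print("camera_angle_x :", meta["camera_angle_x"])
print("training frames:", len(meta["frames"]))
print("first frame    :", meta["frames"][0]["file_path"])  # e.g. ./train/r_0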

? How to Train

  1. Train in our prepared Colab notebook: Colab Pro is recommended, otherwise you may encounter out-of-memory errors.
  2. Train locally: set use_semantic_loss=true in your yaml configuration file to enable DietNeRF, then run:
# data_dir example: nerf_synthetic/lego
python -m train \
  --data_dir=/PATH/TO/YOUR/SCENE/DATA \
  --train_dir=/PATH/TO/THE/PLACE/YOU/WANT/TO/SAVE/CHECKPOINTS \
  --config=configs/CONFIG_YOU_LIKE

? Experimental Results

❗ Images rendered by DietNeRF trained with only 8 views (8-shot)

DietNeRF has a strong capacity to generalize to novel and challenging views with EXTREMELY FEW TRAINING SAMPLES!

HOTDOG / DRUM / SHIP / CHAIR / LEGO / MIC

❗ GIFs rendered by NeRF and DietNeRF trained on 14 occluded views (14-shot)

We created an artificial occlusion on the right side of each training image (only left-side training poses were picked),
so this experiment compares reconstruction quality under occlusion.
DietNeRF shows noticeably better quality than the original NeRF when the training views are occluded.
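For intuition, the occlusion described above can be produced by masking out the right half of each training image, roughly as in this sketch (illustrative only, not necessarily the exact preprocessing we used):

# Illustrative occlusion of the right half of an (H, W, 3) image.
import jax.numpy as jnp

def occlude_right_half(image, fill_value=1.0):
    h, w, _ = image.shape
    left_half = jnp.arange(w) < w // 2              # True on the left half
    mask = left_half[None, :, None]                 # broadcast to (1, W, 1)
    return jnp.where(mask, image, fill_value)       # keep left, fill right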

Training poses

LEGO

DietNeRF / NeRF

SHIP

DietNeRF / NeRF

GitHub

https://github.com/codestella/putting-nerf-on-a-diet