DynamicNeuralGarments

Introduction

This repository contains the implementation of Dynamic Neural Garments, proposed at SIGGRAPH Asia 2021.

./GarmentMotionRenderer

This folder contains the pytorch implementation of the rendering network.

You can play with the code by running “run.py” after downloading the data and checkpoints from here. You will get results similar to those shown below by running the respective checkpoints from multilayers, tango, and twolayers.
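If you prefer to script the inference yourself, the usual PyTorch checkpoint-loading pattern is sketched below. The helper and the commented usage are generic placeholders, not this repository's API; see “run.py” for the actual entry point and model class.

```python
import torch

def load_checkpoint(model: torch.nn.Module, path: str) -> torch.nn.Module:
    """Load a saved state dict onto `model` and switch it to eval mode."""
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    # Assumes the file stores a bare state dict; adjust if the checkpoint
    # wraps it (e.g. {"model": state_dict, "epoch": ...}).
    state = torch.load(path, map_location=device)
    model.load_state_dict(state)
    return model.to(device).eval()

# Hypothetical usage, with a placeholder model class and checkpoint path:
# model = load_checkpoint(RenderNet(), "checkpoints/multilayers.pth")
```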

In case you want to retrain the network, you can download the training data from here. We provide the 3D meshes of the coarse garment and of the target garments, including multilayers, tango, and twolayers. First, run the Blender project to generate the ground-truth renderings and background renderings, and to save the corresponding camera poses. Next, compute the texture sampling map by referring to the code in ./PixelSample. After gathering all the needed data, you can give “train.py” a shot.
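For the camera-pose step in particular, a minimal Blender export sketch is shown below. The output path and file format are assumptions on my part; the scripts in ./blenderScript define the actual convention the renderer expects.

```python
# Run inside Blender. Exports one 4x4 camera-to-world matrix per frame.
import bpy
import numpy as np

scene = bpy.context.scene
cam = scene.camera

poses = []
for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)
    poses.append(np.array(cam.matrix_world))  # 4x4 pose at this frame

# Placeholder output path/format; match ./blenderScript in practice.
np.save("/tmp/camera_poses.npy", np.stack(poses))
```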

./PixelSample

This folder contains the code to generate the texture sampling map. The C++ code relies on OpenCV and Embree.
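Conceptually, the sampling map records, for every output pixel, which texture coordinate on the coarse garment that pixel should sample. The sketch below illustrates that idea in Python, with trimesh standing in for the repository's OpenCV/Embree pipeline; the orthographic camera setup and paths are assumptions, not the C++ code's actual interface.

```python
import numpy as np
import trimesh

# Placeholder path; assumes a single mesh with per-vertex UVs.
mesh = trimesh.load("coarse_garment.obj", force="mesh")

H = W = 64
# One orthographic ray per pixel, shot down the -Z axis toward the mesh.
xs, ys = np.meshgrid(np.linspace(-1, 1, W), np.linspace(-1, 1, H))
origins = np.stack([xs.ravel(), ys.ravel(), np.full(H * W, 10.0)], axis=1)
directions = np.tile([0.0, 0.0, -1.0], (H * W, 1))

locations, index_ray, index_tri = mesh.ray.intersects_location(
    origins, directions, multiple_hits=False)

# Interpolate per-vertex UVs at each hit using barycentric weights.
bary = trimesh.triangles.points_to_barycentric(
    mesh.triangles[index_tri], locations)
uv = (mesh.visual.uv[mesh.faces[index_tri]] * bary[:, :, None]).sum(axis=1)

# Pixels that hit the garment map to a texture coordinate; misses stay 0.
sample_map = np.zeros((H, W, 2), dtype=np.float32)
sample_map.reshape(-1, 2)[index_ray] = uv
```

Note that trimesh can itself use Embree (via pyembree) to accelerate the ray casting, which is the same accelerator the C++ code relies on.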

./blenderScript

This folder contains Python scripts for Blender.
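If you are new to Blender scripting, scripts like these are typically run headlessly with `blender -b <project>.blend -P <script>.py`; check each script for the inputs it expects.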

Citation

If you use our code or model, please cite our paper:

@article{zhang2021dynamic,
  title   = {Dynamic Neural Garments},
  author  = {Zhang, Meng and Ceylan, Duygu and Wang, Tuanfeng and Mitra, Niloy J},
  journal = {arXiv preprint arXiv:2102.11811},
  year    = {2021}
}
