# Neural Motion Learner

## Introduction

This work extracts skeletal structure from volumetric observations and learns motion dynamics from the detected skeletal motions, in a fully unsupervised manner.

Our model performs motion generation, interpolation, and retargeting based on the learned latent dynamics.

Note that this is an unofficial release of the work, so only a minimal amount of code is provided to demonstrate the results.

A full description, including the paper title, training code, and data pre-processing methods, will be uploaded once the paper is accepted to the conference.

## Install

Tested with Python 3.8 on Ubuntu 18.04 LTS.

The model is built with PyTorch 1.7.1 and CUDA 11.0.

Creating a conda environment is recommended.

```bash
# Download the repository
git clone https://github.com/jinseokbae/neural_motion_learner.git
cd neural_motion_learner
# Create a conda env
conda create --name nmotion python=3.8
conda activate nmotion
# Modify setup.sh to match your CUDA setting, then run it
bash setup.sh
```
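
After setup, a quick sanity check (not part of the repo) can confirm that the expected PyTorch and CUDA versions are active inside the conda env:

```python
# Sanity check (illustrative, not part of the repo): verify PyTorch/CUDA.
import torch

print(torch.__version__)          # expected: 1.7.1
print(torch.version.cuda)         # expected: 11.0
print(torch.cuda.is_available())  # should print True on a CUDA machine
```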

## Run

Using the provided pretrained model, run the demo scripts to visualize the following:

```bash
# Motion generation
python vis_generation.py
# Results are stored in output/generation
```

*(demo video: motion generation)*
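
Conceptually, generation rolls a learned latent dynamics model forward and decodes each latent state into a skeletal pose. The sketch below is illustrative only; the decoder, the GRU-cell dynamics, and all dimensions are assumptions, not the repo's actual API:

```python
# Illustrative sketch only -- the decoder, GRUCell dynamics, and all
# dimensions are hypothetical stand-ins for the repo's learned modules.
import torch
import torch.nn as nn

latent_dim, pose_dim, n_frames = 64, 72, 30

decoder = nn.Sequential(                       # toy pose decoder
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, pose_dim))
dynamics = nn.GRUCell(latent_dim, latent_dim)  # stand-in latent dynamics

z = torch.randn(1, latent_dim)   # sampled latent code driving the motion
h = torch.zeros(1, latent_dim)   # latent state
poses = []
for _ in range(n_frames):
    h = dynamics(z, h)           # roll the latent dynamics forward
    poses.append(decoder(h))     # decode the state into a skeletal pose
motion = torch.stack(poses, dim=1)  # shape: (1, n_frames, pose_dim)
```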

```bash
# Motion interpolation
python vis_interpolation.py
# Results are stored in output/interpolation
```

*(demo video: motion interpolation)*
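
Interpolation can be thought of as blending two motions in latent space. A minimal sketch, assuming a 64-dimensional latent code and a simple linear blend (both assumptions, not the repo's method):

```python
# Illustrative sketch only: blend two motions in latent space.
# The 64-dim latent size and the linear blend are assumptions.
import torch

z_a = torch.randn(64)   # latent code of motion A
z_b = torch.randn(64)   # latent code of motion B

alphas = torch.linspace(0.0, 1.0, steps=10)
z_blend = [(1 - a) * z_a + a * z_b for a in alphas]  # linear blend
# Each blended code would then be decoded back into a pose sequence.
```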

```bash
# Motion retargeting
python vis_retarget.py
# Results are stored in output/retarget
```

*(demo video: motion retargeting)*
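
One way to picture retargeting is encoding a motion into a skeleton-agnostic latent code and decoding it conditioned on a different target skeleton. The sketch below is a toy illustration under that assumption; the encoder, decoder, and skeleton descriptor are all hypothetical:

```python
# Illustrative sketch only: retargeting as encode-with-source /
# decode-with-target. All modules and dimensions are hypothetical.
import torch
import torch.nn as nn

pose_dim, latent_dim, skel_dim = 72, 64, 16

encoder = nn.Linear(pose_dim, latent_dim)             # toy motion encoder
decoder = nn.Linear(latent_dim + skel_dim, pose_dim)  # skeleton-conditioned decoder

source_pose = torch.randn(1, pose_dim)  # one frame of the source motion
target_skel = torch.randn(1, skel_dim)  # descriptor of the target skeleton

z = encoder(source_pose)                             # skeleton-agnostic motion code
retargeted = decoder(torch.cat([z, target_skel], 1)) # decode on the target skeleton
```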
