This project is based on RIFE and aims to make RIFE more practical for users by adding various features and designing new models. Because improving the PSNR metric is not always consistent with subjective quality, we keep this part of the work independent of our academic research. To reduce development difficulty, this project targets engineers and developers.

16X interpolation results from two input images:




Model List

v3.8 - 2021.6.17 | Google Drive | Baidu Netdisk, code: kxr3 || v3.5 - 2021.6.12 | Google Drive | Baidu Netdisk, code: 1rb7

v3.1 - 2021.5.17 | Google Drive | Baidu Netdisk, code: 64bz || v3.0 - 2021.5.15 | Google Drive | Baidu Netdisk, code: tgmd


git clone [email protected]:hzwer/Practical-RIFE.git
cd Practical-RIFE
pip3 install -r requirements.txt

Download a model from the model list and put *.py and flownet.pkl in train_log/


Video Frame Interpolation

You can use our demo video or your video.

python3 inference_video.py --exp=1 --video=video.mp4

(generates video_2X_xxfps.mp4)

python3 inference_video.py --exp=2 --video=video.mp4

(for 4X interpolation)
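The --exp flag sets the interpolation factor as a power of two: each interval between consecutive input frames gains 2^exp - 1 new frames. A minimal sketch of this arithmetic (the function name is illustrative, not part of the repo):

```python
def output_frame_count(n_frames: int, exp: int) -> int:
    """Frames produced by 2**exp interpolation of an n-frame clip.

    Each of the (n_frames - 1) intervals between input frames gains
    (2**exp - 1) in-between frames, so the total is
    (n_frames - 1) * 2**exp + 1.
    """
    return (n_frames - 1) * 2 ** exp + 1

# A 100-frame clip with --exp=2 (4X) yields 397 frames,
# and 16X interpolation of two input images yields 17 frames.
print(output_frame_count(100, 2))  # -> 397
print(output_frame_count(2, 4))   # -> 17
```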

python3 inference_video.py --exp=1 --video=video.mp4 --scale=0.5

(If your video has a high resolution, such as 4K, we recommend setting --scale=0.5; the default is 1.0.)
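A rough heuristic for choosing --scale from the input resolution could look like the sketch below. This is an assumption for illustration, not logic from the repo; the 4K threshold follows the recommendation above:

```python
def suggest_scale(width: int, height: int) -> float:
    """Heuristic (not part of the repo): halve the flow-processing
    scale for 4K-class inputs, keep the default 1.0 otherwise."""
    return 0.5 if max(width, height) >= 3840 else 1.0

print(suggest_scale(3840, 2160))  # -> 0.5 (4K)
print(suggest_scale(1920, 1080))  # -> 1.0 (1080p)
```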

python3 inference_video.py --exp=2 --img=input/

(to read frames from PNGs, like input/0.png ... input/612.png; make sure the PNG file names are numbers)
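Numeric file names matter because a plain lexicographic sort would put 10.png before 2.png. A small sketch of ordering frames by their numeric stem (the helper name is illustrative, not from the repo):

```python
from pathlib import Path

def numeric_frame_order(folder: str):
    """Sort PNG frames by their numeric stem, so 2.png comes before
    10.png (a plain lexicographic sort would misorder them)."""
    return sorted(Path(folder).glob("*.png"), key=lambda p: int(p.stem))

# The same idea on bare file names:
names = ["10.png", "2.png", "0.png"]
print(sorted(names, key=lambda s: int(s.split(".")[0])))
# -> ['0.png', '2.png', '10.png']
```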

python3 inference_video.py --exp=2 --video=video.mp4 --fps=60

(adds a slow-motion effect; the audio track will be removed)

The warning 'Warning: Your video has *** static frames, it may change the duration of the generated video.' means that the frame rate of your video was changed by duplicating frames. This is common if a 25FPS video has been converted to 30FPS.
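Such duplicated (static) frames can be found by comparing each frame with its predecessor. A hedged sketch using a mean-absolute-difference threshold (a heuristic assumption, not the repo's exact check; NumPy assumed available):

```python
import numpy as np

def count_static_frames(frames, threshold=1.0):
    """Count frames nearly identical to their predecessor.

    frames: iterable of HxWxC uint8 arrays. A frame is 'static' when
    the mean absolute pixel difference to the previous frame is below
    `threshold`. Heuristic sketch, not the repo's exact detection.
    """
    static = 0
    prev = None
    for f in frames:
        cur = f.astype(np.float32)
        if prev is not None and np.abs(cur - prev).mean() < threshold:
            static += 1
        prev = cur
    return static
```

Removing such duplicates before interpolation (or re-timing afterwards) avoids the duration drift the warning describes.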

To-do List

Multi-frame input for the model

Frame interpolation at arbitrary time positions

Eliminate artifacts as much as possible

Make the model work with inputs of any resolution

Provide models with lower computational cost


Citation

@article{huang2020rife,
  title={RIFE: Real-Time Intermediate Flow Estimation for Video Frame Interpolation},
  author={Huang, Zhewei and Zhang, Tianyuan and Heng, Wen and Shi, Boxin and Zhou, Shuchang},
  journal={arXiv preprint arXiv:2011.06294},
  year={2020}
}