Deep Exemplar-based Colorization

This is the implementation of the paper Deep Exemplar-based Colorization by Mingming He*, Dongdong Chen*, Jing Liao, Pedro V. Sander and Lu Yuan, published in ACM Transactions on Graphics (SIGGRAPH 2018) (* indicates equal contribution).

Deep Exemplar-based Colorization is the first deep learning approach for exemplar-based local colorization. Given a reference color image, our convolutional neural network directly maps a grayscale image to an output colorized image.


The proposed network consists of two sub-networks: the Similarity Sub-net, which computes the semantic similarities between the reference and the target, and the Colorization Sub-net, which selects, propagates, and predicts the chrominance channels of the target.
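To make the two-sub-network structure concrete, here is a minimal PyTorch-style sketch. The layer sizes, channel counts, and class names are illustrative assumptions, not the exact architecture used in this work.

```python
# Minimal sketch of the two-sub-network layout described above.
# Layer sizes, channel counts, and class names are assumptions for illustration.
import torch
import torch.nn as nn

class SimilaritySubNet(nn.Module):
    """Estimates a per-pixel semantic similarity map between the target and the
    reference warped into the target's geometry."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1 + 3, 64, 3, padding=1), nn.ReLU(inplace=True),  # target L + warped reference Lab
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 1, 3, padding=1), nn.Sigmoid(),               # similarity in [0, 1]
        )

    def forward(self, target_l, warped_ref_lab):
        return self.net(torch.cat([target_l, warped_ref_lab], dim=1))

class ColorizationSubNet(nn.Module):
    """Predicts the target's ab chrominance channels from its luminance, the
    warped reference chrominance, and the similarity map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1 + 2 + 1, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 2, 3, padding=1), nn.Tanh(),                  # normalized ab output
        )

    def forward(self, target_l, warped_ref_ab, similarity):
        return self.net(torch.cat([target_l, warped_ref_ab, similarity], dim=1))
```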

The input includes a grayscale target image, a color reference image, and bidirectional mapping functions. We use Deep Image Analogy by default to generate the bidirectional mapping functions; other dense correspondence estimation algorithms can be used instead, as in the sketch below.
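The following sketch shows how such inputs could be assembled once a dense mapping is available. The mapping format (per-pixel (x, y) coordinates into the reference) and the helper name prepare_inputs are assumptions for illustration, not part of this repository.

```python
# Illustrative input preparation, assuming the target-to-reference mapping is given
# as per-pixel (x, y) coordinates into the reference image (format is an assumption).
import cv2
import numpy as np

def prepare_inputs(target_gray, reference_bgr, map_t2r):
    """target_gray: HxW uint8; reference_bgr: HxWx3 uint8;
    map_t2r: HxWx2 float32, the (x, y) match in the reference for each target pixel."""
    ref_lab = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2LAB)
    map_x = np.ascontiguousarray(map_t2r[..., 0])
    map_y = np.ascontiguousarray(map_t2r[..., 1])
    # Warp the reference into the target's geometry using the dense correspondence.
    warped_ref_lab = cv2.remap(ref_lab, map_x, map_y, interpolation=cv2.INTER_LINEAR)
    target_l = target_gray.astype(np.float32) / 255.0                   # target luminance
    warped_ab = warped_ref_lab[..., 1:].astype(np.float32) / 255.0      # reference chrominance, warped
    return target_l, warped_ab
```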
