Continuous-Time Meta-Learning with Forward Mode Differentiation

ICLR 2022 (Spotlight)

This repository contains the official JAX implementation of COMLN (Deleu et al., 2022), a gradient-based meta-learning algorithm in which task adaptation follows a gradient flow. It also includes the memory-efficient algorithm for computing meta-gradients based on forward-mode differentiation. The implementation builds on jax-meta.
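To give a rough idea of the mechanism described above, here is a minimal sketch (not the repository's API) of how forward-mode differentiation can propagate meta-gradients through a gradient flow in JAX. The `adapt` function, the toy quadratic loss, and the Euler discretization are all illustrative assumptions; COMLN itself works with continuous-time adaptation and its own memory-efficient scheme.

```python
import jax
import jax.numpy as jnp

# Toy inner-loop objective: least-squares regression (illustrative only).
def inner_loss(w, x, y):
    return jnp.mean((x @ w - y) ** 2)

def adapt(theta0, x, y, t=1.0, n_steps=100):
    """Euler discretization of the gradient flow dw/dt = -grad L(w),
    starting from the meta-learned initialization theta0."""
    dt = t / n_steps
    w = theta0
    for _ in range(n_steps):
        w = w - dt * jax.grad(inner_loss)(w, x, y)
    return w

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (8, 3))
y = jnp.ones(8)
theta0 = jnp.zeros(3)
v = jnp.ones(3)  # tangent direction in meta-parameter space

# Forward-mode autodiff (jax.jvp) carries the directional derivative of the
# adapted parameters w.r.t. the initialization alongside the flow, without
# storing the trajectory as reverse-mode backpropagation would.
w_t, dw_t = jax.jvp(lambda th: adapt(th, x, y), (theta0,), (v,))
```

One Jacobian-vector product per tangent direction is the trade-off of forward mode; its memory cost stays constant in the number of integration steps, which is the property COMLN exploits.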


Installation

To avoid any conflict with your existing Python setup, we suggest working in a virtual environment:

```bash
python -m venv venv
source venv/bin/activate
```

Follow these instructions to install the version of JAX corresponding to your versions of CUDA and cuDNN. Note that if you want to try COMLN from the example notebook, you must also install Jupyter Notebook.

```bash
git clone
cd jax-comln
pip install -r requirements.txt
```


Citation

If you want to cite COMLN, use the following BibTeX entry:

```
@inproceedings{deleu2022continuous,
    title={{Continuous-Time Meta-Learning with Forward Mode Differentiation}},
    author={Deleu, Tristan and Kanaa, David and Feng, Leo and Kerg, Giancarlo and Bengio, Yoshua and Lajoie, Guillaume and Bacon, Pierre-Luc},
    booktitle={Tenth International Conference on Learning Representations},
    year={2022}
}
```

