mlp-mixer-tf

Unofficial Implementation of MLP-Mixer [abs, pdf] in TensorFlow.

Note: This project may still have some bugs; I'm still learning how to implement papers from scratch. Any help is appreciated :D

Installation and Usage

The package relies purely on TensorFlow. Make sure you have version 2.x installed:

pip install tensorflow
git clone https://github.com/rish-16/mlp-mixer-tf.git
cd mlp-mixer-tf
python main.py

The wrapper-style API is inspired by Phil Wang's work on Transformers and attention (big fan!).

import tensorflow as tf
from mlp_mixer_tf import MLPMixer

model = MLPMixer(
    n_classes=1000,   # number of output classes
    image_size=256,   # input images are image_size x image_size
    n_channels=1,     # single-channel (e.g. grayscale) input
    patch_size=16,    # side length of each square patch
    depth=6,          # number of Mixer layers
    hdim=512          # hidden dimension of the patch embeddings
)

img = tf.random.uniform([1, 256, 256]) # dummy single-channel image: [batch, height, width]
pred = model(img) # class predictions, shape [1, 1000]
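
For context, here is a rough sketch of the patch arithmetic implied by the arguments above, assuming the standard MLP-Mixer patching scheme from the paper (the package handles this step internally, so the sketch is purely illustrative):

import tensorflow as tf

image_size, patch_size, n_channels = 256, 16, 1
img = tf.random.uniform([1, image_size, image_size, n_channels])

# Cut the image into non-overlapping patch_size x patch_size patches
patches = tf.image.extract_patches(
    images=img,
    sizes=[1, patch_size, patch_size, 1],
    strides=[1, patch_size, patch_size, 1],
    rates=[1, 1, 1, 1],
    padding="VALID",
) # [1, 16, 16, 256]: a 16x16 grid of flattened patches

n_patches = (image_size // patch_size) ** 2 # (256 / 16)^2 = 256 patches
patches = tf.reshape(patches, [1, n_patches, patch_size * patch_size * n_channels])
# Each of the 256 patch vectors would then be projected to hdim=512
# before passing through the depth=6 Mixer layers.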

You can also import individual building blocks, such as MLP and MixerLayer, directly from the mlp_mixer_tf package.
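
If you want to see what a Mixer layer computes, the sketch below reproduces the token-mixing / channel-mixing structure from the paper directly in tf.keras. It is not the package's own MixerLayer class; the class name, constructor arguments, and MLP widths are illustrative assumptions, so check the source for the actual API.

import tensorflow as tf
from tensorflow.keras import layers

class MixerLayerSketch(layers.Layer):
    # Illustrative only; not the package's MixerLayer implementation.
    def __init__(self, n_patches, hidden_dim, tokens_mlp_dim=256, channels_mlp_dim=2048):
        super().__init__()
        self.norm1 = layers.LayerNormalization()
        self.norm2 = layers.LayerNormalization()
        # Token-mixing MLP: mixes information across patches, per channel
        self.token_mlp = tf.keras.Sequential([
            layers.Dense(tokens_mlp_dim, activation="gelu"),
            layers.Dense(n_patches),
        ])
        # Channel-mixing MLP: mixes information across channels, per patch
        self.channel_mlp = tf.keras.Sequential([
            layers.Dense(channels_mlp_dim, activation="gelu"),
            layers.Dense(hidden_dim),
        ])

    def call(self, x): # x: [batch, n_patches, hidden_dim]
        y = self.norm1(x)
        y = tf.transpose(y, [0, 2, 1]) # [batch, hidden_dim, n_patches]
        y = self.token_mlp(y) # mix across the patch axis
        y = tf.transpose(y, [0, 2, 1]) # back to [batch, n_patches, hidden_dim]
        x = x + y # residual connection
        y = self.channel_mlp(self.norm2(x)) # mix across the channel axis
        return x + y # residual connection

x = tf.random.uniform([1, 256, 512]) # 256 patches, hidden dim 512
y = MixerLayerSketch(n_patches=256, hidden_dim=512)(x) # same shape as x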

GitHub

https://github.com/rish-16/mlp-mixer-tf