MLP-Mixer

Unofficial implementation of MLP-Mixer, easy to use from the terminal. Train and test easily.

https://arxiv.org/abs/2105.01601

MLP-Mixer is an architecture based exclusively on multi-layer perceptrons (MLPs).
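
For reference, here is a minimal sketch of a single Mixer layer (assuming a PyTorch implementation; the class and argument names are illustrative, not necessarily this repo's code). Each layer applies a token-mixing MLP across image patches and a channel-mixing MLP across features, both with skip connections:

    import torch.nn as nn

    class MlpBlock(nn.Module):
        def __init__(self, dim, hidden_dim):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim, hidden_dim),
                nn.GELU(),
                nn.Linear(hidden_dim, dim),
            )

        def forward(self, x):
            return self.net(x)

    class MixerLayer(nn.Module):
        def __init__(self, n_patches, channels, token_hidden, channel_hidden):
            super().__init__()
            self.norm1 = nn.LayerNorm(channels)
            self.token_mlp = MlpBlock(n_patches, token_hidden)
            self.norm2 = nn.LayerNorm(channels)
            self.channel_mlp = MlpBlock(channels, channel_hidden)

        def forward(self, x):  # x: (batch, n_patches, channels)
            # token mixing: transpose so the MLP acts across the patch dimension
            y = self.norm1(x).transpose(1, 2)
            x = x + self.token_mlp(y).transpose(1, 2)
            # channel mixing: the MLP acts across the channel dimension
            x = x + self.channel_mlp(self.norm2(x))
            return x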

According to the paper, the model offers:

  • Competitive accuracy with CNNs and Transformers
  • Lower time complexity than CNNs and Transformers
  • Fewer parameters than CNNs and Transformers

Quick Start

Clone the repo and install the packages from requirements.txt in a Python>=3.8 environment.

git clone https://github.com/Oguzhanercan/MLP-Mixer
cd MLP-Mixer
pip install -r requirements.txt

Dataset

There are two options for the dataset. You can use one of the pre-defined datasets listed below

  • CIFAR10
  • MNIST
  • Fashion-MNIST

or you can use your own dataset. Organize your folder structure as:

      data
      ├── 0
      │   ├── img0.png
      │   ├── ...
      │   └── img9999.png
      ├── 1
      │   ├── img0.png
      │   ├── ...
      │   └── img9999.png
      └── ...
0 and 1 represent folders that each contain images belonging to one particular class. There is no limit on the number of classes or images.
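
This layout matches the convention used by torchvision's ImageFolder, where each subfolder name becomes a class label. A minimal loading sketch (assuming the repo relies on torchvision; the image size and batch size below are just examples and should match the --im_size and --batch_size arguments):

    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms

    transform = transforms.Compose([
        transforms.Resize((32, 32)),  # keep consistent with --im_size
        transforms.ToTensor(),
    ])
    # subfolders 0, 1, ... are mapped to class indices automatically
    dataset = datasets.ImageFolder("data", transform=transform)
    loader = DataLoader(dataset, batch_size=64, shuffle=True)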

Train

Open a terminal in the cloned directory, then run the command below.

python main.py --mode train --dataset CIFAR10 --save True --device cuda --epochs 20 --valid_per 0.2 

You can customize the model hyperparameters; all available arguments are listed below.

Arguments:

  • dataset
  • train_path
  • test_path
  • batch_size
  • im_size
  • valid_per
  • epochs
  • learning_rate
  • beta1
  • beta2
  • n_classes
  • cuda
  • evaluate_per_epoch
  • save_model
  • model_path
Custom dataset mode requires the following arguments: mode, dataset, train_path, n_classes, im_size.
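
For example, a custom-dataset run could look like the following (the --dataset value and the path are placeholders; check main.py for the exact values it expects):

python main.py --mode train --dataset custom --train_path ./data --n_classes 2 --im_size 32 --device cuda --epochs 20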
