MNIST_AttentionMap

[TensorFlow] Attention mechanism with the MNIST dataset

Usage

$ python run.py
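The internals of run.py are not shown here. As a hedged sketch of the general technique, a TensorFlow 2 classifier that also emits a spatial attention map could be structured as follows (all layer sizes, names, and design choices below are assumptions for illustration, not the repository's actual code):

```python
import numpy as np
import tensorflow as tf

def build_attention_model(num_classes=10):
    """Hypothetical sketch: a small CNN whose features are reweighted
    by a learned single-channel spatial attention map."""
    inputs = tf.keras.Input(shape=(28, 28, 1))
    x = tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
    x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    # Spatial attention: one sigmoid channel over the 28x28 grid.
    attn = tf.keras.layers.Conv2D(1, 1, activation="sigmoid",
                                  name="attention_map")(x)
    # Reweight the feature map by the attention map (broadcast over channels).
    x = tf.keras.layers.Multiply()([x, attn])
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    # Expose both the class probabilities and the attention map.
    return tf.keras.Model(inputs, [outputs, attn])

model = build_attention_model()
probs, attn_map = model(np.zeros((1, 28, 28, 1), dtype="float32"))
print(probs.shape, attn_map.shape)  # (1, 10) (1, 28, 28, 1)
```

Exposing the attention map as a second model output is what makes the visualizations below possible: it can be fetched at test time without any extra forward pass.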

Result

Training

[Figure: loss graph during training.]

Test

[Figures: attention results for the digits 1, 3, 5, 7, and 9.]

Each figure shows, from left to right, the input digit, the attention map, and the overlaid image.
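The overlaid image can be produced by blending the normalized attention map over the input digit. A minimal NumPy sketch (the helper name and blending weight are assumptions, not the repository's code):

```python
import numpy as np

def overlay_attention(digit, attn, alpha=0.6):
    """Blend an attention map over a digit image.

    digit: (28, 28) grayscale image in [0, 1].
    attn:  (28, 28) attention weights (any range).
    """
    # Normalize the attention map to [0, 1] before blending.
    attn = (attn - attn.min()) / (attn.max() - attn.min() + 1e-8)
    return (1 - alpha) * digit + alpha * attn

digit = np.random.rand(28, 28)
attn = np.random.rand(28, 28)
overlap = overlay_attention(digit, attn)
print(overlap.shape)  # (28, 28)
```

In practice the attention map is usually rendered with a color map (e.g. via matplotlib's `imshow` with `alpha`) rather than blended in grayscale, but the normalization step is the same.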

Further usage

[Figures: further usage examples (f0, f1, f2).]

As a further usage, the attention map can be used to detect the location of digits.
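One simple way to turn an attention map into a digit location is to threshold it and take the bounding box of the remaining active pixels. A hedged NumPy sketch (this helper is illustrative, not taken from the repository):

```python
import numpy as np

def locate_digit(attn, threshold=0.5):
    """Return the bounding box (top, left, bottom, right) of the region
    where the attention map exceeds a fraction of its maximum."""
    mask = attn >= threshold * attn.max()
    rows, cols = np.where(mask)
    return int(rows.min()), int(cols.min()), int(rows.max()), int(cols.max())

# Synthetic attention map with a single active blob.
attn = np.zeros((28, 28))
attn[10:15, 8:12] = 1.0
print(locate_digit(attn))  # (10, 8, 14, 11)
```

For noisy maps, a connected-component step (e.g. `scipy.ndimage.label`) before taking the bounding box would make the localization more robust to spurious activations.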

Requirements

  • TensorFlow 2.3.0
  • NumPy 1.18.5

GitHub

https://github.com/YeongHyeon/MNIST_AttentionMap