Minimal Interactive Attention Visualization

A small example of an interactive visualization of attention values as used by transformer language models like GPT-2 and BERT.

by Hendrik Strobelt and Sebastian Gehrmann for the SIDN IAP class at MIT, Jan 2020
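The attention values visualized here are the softmax-normalized scores from the transformer's attention mechanism. As a toy illustration (not code from this repo), a single attention distribution for one query over a list of keys can be computed in pure Python:

```python
import math

def attention_weights(query, keys):
    """Compute softmax(q . k / sqrt(d)) over a list of key vectors.

    Toy sketch of the attention values the demo visualizes; real models
    compute this per layer and per head over learned projections.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    # Numerically stable softmax.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]
```

The returned weights sum to 1; a higher weight means the query token attends more strongly to that key's token.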


Preparation

  • Install Anaconda or Miniconda
  • Run conda env create -f environment.yml to create a new environment called attnvis

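For orientation, a conda environment file for this kind of setup typically has the following shape. The package list below is an assumption based on the PyTorch/Hugging Face backend described under Structure; the repo's actual environment.yml is authoritative.

```yaml
# Assumed shape only -- see the repo's environment.yml for the real spec.
name: attnvis
dependencies:
  - python
  - pip
  - pip:
      - torch
      - transformers
```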
Running the demo

  • Activate the conda environment: conda activate attnvis
  • Run the server: python server.py
  • Visit http://localhost:8888/

Structure

api.py             -- interface to the PyTorch / Hugging Face backend
server.py          -- defines a REST interface for the api.py calls

client/*           -- all client files
client/index.html  -- main file, including the JS code
client/styles.css  -- all CSS styles
client/tools.js    -- helper functions
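As a rough sketch of how server.py can expose api.py's results to the client over HTTP, here is a minimal stdlib-only example. The route name, payload fields, and the placeholder get_attention function are illustrative assumptions, not taken from the repo (where real attention values come from the model backend):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def get_attention(text):
    # Hypothetical stand-in for the api.py call: returns tokens plus a
    # placeholder uniform attention matrix instead of real model output.
    tokens = text.split()
    n = len(tokens)
    attn = [[1.0 / n] * n for _ in range(n)]
    return {"tokens": tokens, "attention": attn}

class AttnHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Single illustrative JSON endpoint; the real server also serves
        # the static files under client/*.
        if self.path.startswith("/api/attn"):
            body = json.dumps(get_attention("hello world")).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def serve(port=8888):
    # Start the server on a background thread and return it.
    server = HTTPServer(("localhost", port), AttnHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

The client-side JS then only needs to fetch the JSON endpoint and render the attention matrix.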

Libraries used

The backend builds on PyTorch and the Hugging Face transformers library (see api.py).
