Keras Global Context Attention Blocks

Keras implementation of the Global Context block from the paper GCNet: Non-local Networks Meet Squeeze-Excitation Networks and Beyond.

Supports Conv1D, Conv2D and Conv3D directly with no modifications.
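
For orientation, the snippet below is a minimal 2D sketch of what such a block computes, assuming the structure described in the paper: softmax attention pooling over all spatial positions to form one global context vector, a 1x1 bottleneck transform scaled by reduction_ratio, and a broadcast addition back onto the input. The function name and exact layer choices are illustrative assumptions (the paper also places a LayerNorm inside the bottleneck, omitted here); gc.py is the reference implementation.

import keras.backend as K
from keras.layers import Activation, Conv2D, Reshape, Softmax, add, dot

def gc_block_2d_sketch(x, reduction_ratio=16, transform_activation='linear'):
    # assumes channels-last input with known spatial dimensions
    channels = K.int_shape(x)[-1]

    # context modelling: softmax attention pooling over all H*W positions
    attn = Conv2D(1, 1)(x)                           # (B, H, W, 1) logits
    attn = Softmax(axis=1)(Reshape((-1, 1))(attn))   # weights over positions
    feats = Reshape((-1, channels))(x)               # (B, H*W, C)
    context = dot([feats, attn], axes=(1, 1))        # (B, C, 1) context vector
    context = Reshape((1, 1, channels))(context)

    # transform: 1x1 bottleneck whose width is channels // reduction_ratio
    t = Conv2D(channels // reduction_ratio, 1, activation='relu')(context)
    t = Conv2D(channels, 1)(t)
    t = Activation(transform_activation)(t)          # 'linear' in the paper

    # fusion: broadcast-add the transformed context onto every position
    return add([x, t])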

Usage

Import global_context_block from gc.py and pass it a tensor as input. (Note: Python's built-in gc module takes import precedence over a local gc.py, so the file may need to be renamed, e.g. to gc_block.py, for the import below to work.)

from keras.layers import Input
from gc import global_context_block

ip = Input(...)
x = ConvND(...)(ip)  # ConvND stands for Conv1D, Conv2D or Conv3D

# apply Global Context
x = global_context_block(x, reduction_ratio=16, transform_activation='linear')
...
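
A hypothetical end-to-end example (the input shape and convolution settings below are arbitrary illustrative choices, not taken from the repository):

from keras.layers import Input, Conv2D
from keras.models import Model
from gc import global_context_block

ip = Input(shape=(32, 32, 3))
x = Conv2D(64, (3, 3), padding='same')(ip)

# apply Global Context after the convolution
x = global_context_block(x, reduction_ratio=16, transform_activation='linear')

model = Model(ip, x)
model.summary()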

Parameters

There are just two parameters to manage:

 - reduction_ratio: The ratio by which the number of channels is reduced inside the
                    transform block's bottleneck (see the sketch after this list).
 - transform_activation: The activation applied to the transformed context prior to
                         its addition with the input. The paper uses no activation
                         (`linear`), but `sigmoid` may do better.
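
For example, assuming a 64-channel input and the bottleneck transform shown in the sketch above (both hypothetical values):

# reduction_ratio=16 gives a bottleneck width of 64 // 16 = 4 channels.
# transform_activation='sigmoid' gates the transformed context with a
# sigmoid just before it is added back to the input.
x = global_context_block(x, reduction_ratio=16, transform_activation='sigmoid')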

Requirements

  • Keras 2.2.4+
  • TensorFlow 1.13+ or CNTK
