Sandwich Batch Normalization
Code for Sandwich Batch Normalization.
We present Sandwich Batch Normalization (SaBN), an extremely easy-to-implement improvement of Batch Normalization (BN) that requires only a few lines of code changes.
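Conceptually, SaBN factorizes the affine layer of BN into a shared "sandwich" affine followed by a set of parallel, independent (e.g. per-class) affine layers. Below is a minimal PyTorch sketch of this idea; it is a hypothetical re-implementation (the module name `SaBN2d` is ours), not the repository's reference code:

```python
import torch
import torch.nn as nn

class SaBN2d(nn.Module):
    """Minimal sketch of Sandwich Batch Normalization.

    SaBN(x, y) = gamma_y * (gamma_sa * x_hat + beta_sa) + beta_y,
    where x_hat is the batch-normalized input, (gamma_sa, beta_sa) is a
    shared "sandwich" affine, and (gamma_y, beta_y) is a per-class affine.
    """
    def __init__(self, num_features, num_classes):
        super().__init__()
        # Shared normalization plus the shared sandwich affine
        # (nn.BatchNorm2d's own weight/bias play that role here).
        self.bn = nn.BatchNorm2d(num_features, affine=True)
        # Independent per-class affine parameters, as in conditional BN.
        self.gamma = nn.Embedding(num_classes, num_features)
        self.beta = nn.Embedding(num_classes, num_features)
        nn.init.ones_(self.gamma.weight)
        nn.init.zeros_(self.beta.weight)

    def forward(self, x, y):
        out = self.bn(x)                      # normalize + sandwich affine
        g = self.gamma(y)[:, :, None, None]   # (N, C, 1, 1)
        b = self.beta(y)[:, :, None, None]
        return g * out + b                    # class-conditional affine
```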
We demonstrate the prevailing effectiveness of SaBN as a drop-in replacement in four tasks:
- conditional image generation,
- neural architecture search,
- adversarial training,
- arbitrary neural style transfer.
See each task below for details:
1. Conditional Image Generation
Using SaBN in the conditional image generation task yields an immediate performance boost. Evaluation results on CIFAR-10 are shown below (numbers in parentheses are the changes relative to each baseline); a usage sketch follows the table:
| Model | Inception Score ↑ | FID ↓ |
| --- | --- | --- |
| AutoGAN-SaBN (ours) | 8.72 (+0.29) | 9.11 (−1.40) |
| BigGAN-SaBN (ours) | 9.01 (+0.10) | 8.03 (−0.54) |
| SNGAN-SaBN (ours) | 8.89 (+0.13) | 8.97 (−1.21) |
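As a usage sketch (hypothetical shapes and class count, with `SaBN2d` as defined above), a class-conditional normalization layer inside a generator block can be replaced one-for-one:

```python
import torch

# Hypothetical drop-in usage inside a conditional generator block.
sabn = SaBN2d(num_features=256, num_classes=10)   # CIFAR-10: 10 classes
x = torch.randn(8, 256, 16, 16)                   # intermediate feature maps
y = torch.randint(0, 10, (8,))                    # class labels
out = sabn(x, y)                                  # same shape as x
```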
Visual results on ImageNet (128×128 resolution):
2. Neural Architecture Search
We adopted DARTS as the baseline search algorithm. Results on NAS-Bench-201 are presented below:
| Method | CIFAR-100 (top-1) | ImageNet (top-1) |
| --- | --- | --- |
| DARTS | 44.05 ± 7.47 | 36.47 ± 7.06 |
| DARTS-SaBN (ours) | 71.56 ± 1.39 | 45.85 ± 0.72 |
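In the NAS setting there are no class labels to condition on. One plausible adaptation (an assumption on our part, not necessarily the repository's exact design) is to keep the shared BN and sandwich affine while giving each candidate operation in a DARTS-style supernet its own private affine:

```python
import torch
import torch.nn as nn

class SaBN2dNAS(nn.Module):
    """Sketch: sandwich BN for a DARTS-style supernet (assumption: shared
    BN + sandwich affine, with one private affine per candidate op)."""
    def __init__(self, num_features, num_ops):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_features, affine=True)  # shared part
        self.gammas = nn.Parameter(torch.ones(num_ops, num_features))
        self.betas = nn.Parameter(torch.zeros(num_ops, num_features))

    def forward(self, x, op_idx):
        out = self.bn(x)
        g = self.gammas[op_idx].view(1, -1, 1, 1)  # broadcast over batch
        b = self.betas[op_idx].view(1, -1, 1, 1)
        return g * out + b
```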
3. Adversarial Training
(Tables: evaluation of standard BN, AuxBN, and SaAuxBN (ours), on both the clean branch and the adversarial branch.)
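AuxBN maintains separate BN branches for clean and adversarial inputs. Under one plausible reading of the sandwiched variant (an assumption on our part; see the repository for the reference code), the branches keep private statistics and private affines while sharing a single sandwich affine:

```python
import torch
import torch.nn as nn

class SaAuxBN2d(nn.Module):
    """Sketch of a sandwiched auxiliary BN (assumption: AuxBN-style
    separate statistics per branch, one shared sandwich affine, and a
    private affine per branch; 0 = clean, 1 = adversarial)."""
    def __init__(self, num_features, num_branches=2):
        super().__init__()
        # Branch-private normalization statistics, no affine of their own.
        self.bns = nn.ModuleList(
            [nn.BatchNorm2d(num_features, affine=False) for _ in range(num_branches)]
        )
        # Shared sandwich affine.
        self.gamma_sa = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.beta_sa = nn.Parameter(torch.zeros(1, num_features, 1, 1))
        # Branch-private affines.
        self.gammas = nn.Parameter(torch.ones(num_branches, num_features))
        self.betas = nn.Parameter(torch.zeros(num_branches, num_features))

    def forward(self, x, branch):
        out = self.gamma_sa * self.bns[branch](x) + self.beta_sa
        g = self.gammas[branch].view(1, -1, 1, 1)
        b = self.betas[branch].view(1, -1, 1, 1)
        return g * out + b
```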
4. Arbitrary Neural Style Transfer
The model equipped with the proposed SaAdaIN achieves lower style and content losses on both the training and validation sets; a layer sketch follows.
(Plots: training/validation style loss and training/validation content loss curves.)
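AdaIN normalizes content features and re-styles them with the style features' channel statistics; the sandwich variant inserts a shared learnable affine in between. A minimal sketch under that reading (our construction, not the repository's reference code):

```python
import torch
import torch.nn as nn

class SaAdaIN(nn.Module):
    """Sketch of sandwich AdaIN (assumption: instance normalization, a
    shared learnable sandwich affine, then the usual AdaIN re-styling
    with the style features' channel-wise statistics)."""
    def __init__(self, num_features):
        super().__init__()
        self.inorm = nn.InstanceNorm2d(num_features, affine=False)
        self.gamma_sa = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.beta_sa = nn.Parameter(torch.zeros(1, num_features, 1, 1))

    def forward(self, content, style):
        # Normalize content, then apply the shared sandwich affine.
        out = self.gamma_sa * self.inorm(content) + self.beta_sa
        # Style statistics act as the second, input-conditioned affine.
        s_mean = style.mean(dim=(2, 3), keepdim=True)
        s_std = style.std(dim=(2, 3), keepdim=True) + 1e-5  # avoid zero scale
        return s_std * out + s_mean
```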