ReLU-GP Residual (RGPR)

This repository contains code for reproducing the following NeurIPS 2021 paper:

  Kristiadi, Agustinus, Matthias Hein, and Philipp Hennig. "An infinite-feature extension for Bayesian ReLU nets that fixes their asymptotic overconfidence." NeurIPS, 2021.



Repository structure
  • eval_*.py scripts run the experiments.
  • aggregate_*.py scripts process the experiment results into paper-ready form.

RGPR-specific code

  • The implementation of the double-sided cubic spline (DSCS) kernel is in rgpr/
  • To apply RGPR on a BNN, see the respective prediction code (no retraining required):
    • The predict function in laplace/ for last-layer Laplace.
    • The predict function in laplace/ for general BNNs (with Monte Carlo sampling).
  • To do hyperparameter search for RGPR, see the get_best_kernel_var function in
  • To generate mean and standard deviation of an NN’s activations (required by the non-asymptotic extension of RGPR), use
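The DSCS kernel mentioned above can be sketched as follows. This is a minimal, hedged sketch, assuming the DSCS kernel is the sum of a one-sided cubic-spline kernel evaluated on the original inputs and on their mirror images; the repository's actual implementation (hyperparameters, per-dimension handling, vectorization) may differ.

```python
def cubic_spline(x, xp):
    """One-sided cubic-spline kernel: the integral of
    relu(x - s) * relu(xp - s) over s >= 0. It vanishes unless
    both inputs lie on the positive half-line."""
    if x <= 0.0 or xp <= 0.0:
        return 0.0
    m, M = min(x, xp), max(x, xp)
    return m**3 / 3.0 + m**2 * (M - m) / 2.0

def dscs(x, xp):
    """Double-sided cubic-spline (DSCS) kernel: the one-sided kernel
    applied to both the positive and the mirrored half-line, so the
    kernel grows in |x| in either direction away from the origin."""
    return cubic_spline(x, xp) + cubic_spline(-x, -xp)
```

The cubic growth of this kernel away from the origin is what lets the GP variance dominate far from the data, counteracting the asymptotic overconfidence of the ReLU net.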
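The no-retraining recipe above (wrapping an existing BNN's predict step) can be illustrated with a hedged sketch: keep the BNN's predictive mean and inflate its predictive variance by the GP prior variance at the test point. The function name and the exact combination rule are illustrative assumptions, not the repository's API.

```python
def rgpr_predict(bnn_mean, bnn_var, x, kernel, kernel_var=1.0):
    """Sketch of RGPR at prediction time: the BNN's predictive mean is
    unchanged, while its predictive variance is inflated by the kernel's
    prior variance k(x, x), summed over input dimensions and scaled by a
    tunable kernel variance (cf. get_best_kernel_var).  Illustrative
    only; the repository's predict functions may combine terms
    differently."""
    k_xx = sum(kernel(xi, xi) for xi in x)
    return bnn_mean, bnn_var + kernel_var * k_xx
```

Far from the training data, k(x, x) grows, so the combined predictive variance grows with it; near the origin the BNN's own uncertainty dominates.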

Running the code

  1. Install dependencies (check requirements.txt). We use Python 3.7.
  2. Install BackPACK.
  3. In util/, change path = ... to your liking.
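The setup steps above can be collected into a short shell session. This is a sketch: the virtualenv name is arbitrary, and `backpack-for-pytorch` is assumed to be BackPACK's PyPI package name (verify against the BackPACK documentation if the install fails).

```shell
# Isolated Python 3.7 environment (the name "venv" is arbitrary)
python3.7 -m venv venv
. venv/bin/activate

# Dependencies listed by the repository
pip install -r requirements.txt

# BackPACK -- assumed PyPI package name
pip install backpack-for-pytorch
```

After this, edit the path = ... setting in util/ as described in step 3.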

Pre-trained models are in

