Bayesian Ensembles: Interactive Demo

Uncertainty in Neural Networks: Approximately Bayesian Ensembling - JS Demo

Interactive demo of a method for capturing uncertainty in NNs, presented in our paper.

This code trains five fully-connected NNs, each with one hidden layer of 50 nodes.
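For readers who want the setup without reading the JavaScript, here is a minimal NumPy sketch (the names init_nn and forward and the exact initialisation scales are illustrative assumptions, not the demo's code):

import numpy as np

N_ENSEMBLE = 5   # five networks, as in the demo
N_HIDDEN = 50    # one hidden layer of 50 nodes

def init_nn(rng, n_in=1, n_hidden=N_HIDDEN, prior_var=10.0):
    # Draw one network's weights from an assumed Gaussian prior;
    # these initial draws can also serve as anchors for the anchored loss.
    return {
        "W1": rng.normal(0.0, np.sqrt(prior_var), size=(n_in, n_hidden)),
        "b1": rng.normal(0.0, np.sqrt(prior_var), size=n_hidden),
        "W2": rng.normal(0.0, np.sqrt(prior_var / n_hidden), size=(n_hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(params, x, activation=np.tanh):
    # Fully-connected NN with a single hidden layer (tanh or ReLU).
    h = activation(x @ params["W1"] + params["b1"])
    return h @ params["W2"] + params["b2"]

rng = np.random.default_rng(0)
ensemble = [init_nn(rng) for _ in range(N_ENSEMBLE)]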

Hyperparameters can be modified at the bottom. Switch between the unconstrained, regularised, and anchored loss functions.

Click the plot to add data points. Black lines show the five individual NN estimates; colour shows the ensemble's predictive uncertainty (+/- 3 standard deviations). We provide a like-for-like comparison on the same data with the dropout method here.
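Continuing the sketch above, the band can be computed from the spread of the five networks' predictions. Adding the data noise variance to the ensemble variance follows the paper's predictive distribution, though whether the demo's plotted band includes that term is an assumption here:

def predict_with_uncertainty(ensemble, x, data_noise_var=0.01):
    # Stack the five networks' outputs: shape (N_ENSEMBLE, n_points, 1).
    preds = np.stack([forward(p, x) for p in ensemble])
    mean = preds.mean(axis=0)
    # Spread across the ensemble plus (assumed) data noise variance.
    std = np.sqrt(preds.var(axis=0) + data_noise_var)
    return mean, mean - 3.0 * std, mean + 3.0 * std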

Hyperparameters

Activation function:
ReLU TanH

Prior variance: How much do you expect the underlying function to vary?
1.0 10.0 100.0

Data noise variance: How closely do you need to fit the data?
0.001 0.01 0.1

Loss function to use for ensemble:
Note how the regularised loss function reduces diversity in the ensemble, while the unconstrained loss overfits the data when prior variance is low and data noise variance is high. (A sketch of the three losses appears below the controls.)
Unconstrained Regularised Bayesian (Anchored)

(Click 'Reset NNs' for new hyperparameters to take effect)
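As a rough guide to what the three loss options do, here is a sketch of the per-network training losses, following the paper's anchored-ensembling scheme. Treating the regularisation strength as a single scalar lambda = data noise variance / prior variance is a simplifying assumption; the paper allows it to differ per weight:

def flatten(params):
    return np.concatenate([p.ravel() for p in params.values()])

def ensemble_loss(params, anchor, x, y, mode="anchored",
                  data_noise_var=0.01, prior_var=10.0):
    n = len(y)
    mse = np.mean((y - forward(params, x)) ** 2)
    if mode == "unconstrained":
        return mse  # fits the data only; prone to overfitting
    lam = data_noise_var / prior_var  # assumed scalar regulariser
    theta = flatten(params)
    if mode == "regularised":
        return mse + (lam / n) * np.sum(theta ** 2)  # shrink towards zero
    # Anchored: shrink towards this member's own draw from the prior,
    # which keeps the ensemble diverse.
    return mse + (lam / n) * np.sum((theta - anchor) ** 2)

Each network keeps the anchor it was initialised with (e.g. anchors = [flatten(p) for p in ensemble], taken before training). Regularising every member towards the same point (zero) is what collapses diversity in the regularised case.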

Adapted from Yarin Gal's dropout demo (https://github.com/yaringal/DropoutUncertaintyDemos), itself based on Andrej Karpathy's ConvNetJS regression demo (https://cs.stanford.edu/people/karpathy/convnetjs/demo1/regression.html).