Bayesian Ensembles: Interactive Demo
Uncertainty in Neural Networks: Approximately Bayesian Ensembling - JS Demo
Interactive demo of a method for capturing uncertainty in NNs, presented in our paper. This code trains five fully-connected NNs, each with one hidden layer of 50 nodes. Hyperparameters can be modified at the bottom. Switch between the unconstrained / regularised / anchored loss functions. Click the plot to add data points. Black lines show the five individual NN estimates; colour shows the ensemble's predictive uncertainty (±3 standard deviations). We provide a like-for-like comparison on the same data with the dropout method here.
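The three loss options differ only in their regularisation term: none at all, a penalty pulling parameters towards zero, or (anchored) a penalty pulling each network towards its own random draw from the prior. A minimal sketch of that difference, assuming flattened parameter vectors; the function and variable names here are illustrative, not the demo's actual implementation:

```javascript
// Standard-normal sample via Box-Muller, used to draw anchor points from the prior.
function randn() {
  const u = 1 - Math.random(), v = Math.random();
  return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

// Squared L2 distance between two parameter vectors.
function sqDist(a, b) {
  return a.reduce((s, ai, i) => s + (ai - b[i]) ** 2, 0);
}

// theta: flattened NN parameters; thetaAnc: this network's anchor point;
// dataNoiseVar (sigma_eps^2) and priorVar (sigma_prior^2) are the two
// hyperparameters exposed in this demo.
function loss(theta, thetaAnc, preds, targets, dataNoiseVar, priorVar, mode) {
  const n = targets.length;
  const mse = targets.reduce((s, t, i) => s + (t - preds[i]) ** 2, 0) / n;
  const lambda = dataNoiseVar / (n * priorVar); // regularisation strength
  if (mode === "unconstrained") {
    return mse;                                  // no penalty: can overfit
  }
  if (mode === "regularised") {
    const zeros = new Array(theta.length).fill(0);
    return mse + lambda * sqDist(theta, zeros);  // shrink to zero: reduces diversity
  }
  return mse + lambda * sqDist(theta, thetaAnc); // anchored: shrink to a prior draw
}

// Each ensemble member draws its own anchor from N(0, priorVar),
// so the five networks are pulled towards five different points.
const priorVar = 1.0;
const thetaAnc = [randn(), randn()].map(z => z * Math.sqrt(priorVar));
```

Because every member keeps its own anchor, the ensemble stays diverse away from the data, which is what produces the widening uncertainty bands in the anchored setting.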
Hyperparameters
Activation function:
Prior variance: How noisy do you assume the function is?
Data noise variance: How closely do you need to fit the data?
Loss function to use for ensemble:

Note how the regularised loss function reduces diversity in the ensemble. The unconstrained case overfits the data in the low prior variance / high data noise variance setting. (Click 'Reset NNs' for new hyperparameters to take effect.)

Adapted from Yarin Gal, https://github.com/yaringal/DropoutUncertaintyDemos, originally by Andrej Karpathy, https://cs.stanford.edu/people/karpathy/convnetjs/demo1/regression.html.