Multi-headed Neural Ensemble Search
ICML Workshop on Uncertainty and Robustness in Deep Learning (UDL), 2021
Abstract: Ensembles of CNN models trained with different random seeds (also known as Deep Ensembles) are known to achieve superior performance over a single copy of the CNN. Neural Ensemble Search (NES) can further boost performance by adding architectural diversity. However, the cost of NES remains prohibitive under limited computational resources. In this work, we extend NES to multi-headed ensembles, which consist of a shared backbone attached to multiple prediction heads. Unlike Deep Ensembles, multi-headed ensembles can be trained end to end, which enables us to leverage one-shot NAS methods to optimize an ensemble objective. Extensive empirical evaluations demonstrate that multi-headed ensemble search finds robust ensembles 3 times faster than other ensemble search methods, while matching them in both predictive performance and uncertainty calibration.
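The multi-headed architecture described above can be illustrated with a minimal numpy sketch: a single shared feature extractor feeds several lightweight prediction heads, and the ensemble prediction averages the heads' class probabilities. All dimensions and the linear backbone are illustrative assumptions, not the paper's actual CNN implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical dimensions, chosen only for illustration.
n, d_in, d_feat, n_classes, n_heads = 4, 8, 16, 3, 5

# Shared backbone: one random linear map standing in for the shared CNN trunk.
W_backbone = rng.normal(size=(d_in, d_feat))

# Multiple lightweight prediction heads attached to the same backbone.
heads = [rng.normal(size=(d_feat, n_classes)) for _ in range(n_heads)]

x = rng.normal(size=(n, d_in))
features = np.tanh(x @ W_backbone)   # computed once, reused by every head

# Each head produces its own class probabilities ...
per_head = np.stack([softmax(features @ W_h) for W_h in heads])

# ... and the ensemble averages them into a single prediction.
ensemble_probs = per_head.mean(axis=0)
print(ensemble_probs.shape)  # (4, 3)
```

Because the heads share the backbone's forward pass, the whole ensemble can be trained end to end with a single optimizer, which is what makes one-shot NAS over the ensemble objective feasible.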
Paper
BibTeX reference
@InProceedings{SB21a,
  author    = "A. Narayanan and A. Zela and T. Saikia and T. Brox and F. Hutter",
  title     = "Multi-headed Neural Ensemble Search",
  booktitle = "ICML Workshop on Uncertainty and Robustness in Deep Learning (UDL)",
  year      = "2021",
  url       = "http://lmb.informatik.uni-freiburg.de/Publications/2021/SB21a"
}