Wisdom of Committees: An Overlooked Approach to Faster and More Accurate Models

Xiaofang Wang · Dan Kondratyuk · Eric Christiansen · Kris Kitani · Yair Movshovitz-Attias · Elad Eban


Keywords: [ ensemble ] [ efficiency ]

Poster: Spot A2 · Tue 26 Apr 10:30 a.m. – 12:30 p.m. PDT

Abstract


Committee-based models (ensembles or cascades) are built by combining existing pre-trained models. While ensembles and cascades are well-known techniques that predate deep learning, they are not considered a core building block of deep model architectures and are rarely used as baselines in recent literature on efficient models. In this work, we go back to basics and conduct a comprehensive analysis of the efficiency of committee-based models. We find that even the simplest method for building committees from existing, independently pre-trained models can match or exceed the accuracy of state-of-the-art models while being drastically more efficient. These simple committee-based models also outperform sophisticated neural architecture search methods (e.g., BigNAS). These findings hold across several tasks, including image classification, video classification, and semantic segmentation, and across architecture families such as ViT, EfficientNet, ResNet, MobileNetV2, and X3D. Our results show that an EfficientNet cascade can achieve a 5.4x speedup over EfficientNet-B7, and a ViT cascade a 2.3x speedup over ViT-L-384, at equal accuracy.
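To make the cascade idea concrete, here is a minimal sketch of a confidence-thresholded cascade over independently pre-trained models. The threshold value and the toy stand-in "models" are illustrative assumptions, not the paper's exact configuration: inputs are routed to a cheap model first and escalate to a more expensive one only when the cheap model's softmax confidence is low.

```python
import numpy as np

def softmax(logits):
    """Convert a vector of logits to probabilities."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

def cascade_predict(models, x, threshold=0.9):
    """Run models from cheapest to most expensive; stop as soon as the
    top softmax probability clears `threshold` (the last model always
    answers). Returns (predicted_class, index_of_model_used)."""
    for i, model in enumerate(models):
        probs = softmax(model(x))
        if probs.max() >= threshold or i == len(models) - 1:
            return int(probs.argmax()), i

# Toy stand-ins for pre-trained classifiers mapping an input to logits.
small = lambda x: np.array([4.0, 0.1, 0.1])  # cheap, confident here
large = lambda x: np.array([0.2, 0.3, 0.1])  # expensive fallback

pred, used = cascade_predict([small, large], x=None, threshold=0.8)
# The cheap model is confident enough, so the large model is skipped.
```

Raising the threshold sends more inputs to the expensive model, trading compute for accuracy; sweeping this single knob traces out the speed-accuracy curves the paper reports.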
