Poster
Mon 1:00 Training with Quantization Noise for Extreme Model Compression
Pierre Stock, Angela Fan, Benjamin Graham, Edouard Grave, Rémi Gribonval, Hervé Jégou, Armand Joulin
Poster
Mon 1:00 Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors
Linfeng Zhang, Kaisheng Ma
Poster
Mon 1:00 Overfitting for Fun and Profit: Instance-Adaptive Data Compression
Ties van Rozendaal, Iris Huijben, Taco Cohen
Poster
Tue 9:00 UMEC: Unified model and embedding compression for efficient recommendation systems
Jiayi Shen, Haotao Wang, Shupeng Gui, Jianchao Tan, Zhangyang Wang, Ji Liu
Poster
Tue 17:00 Knowledge Distillation as Semiparametric Inference
Tri Dao, Govinda Kamath, Vasilis Syrgkanis, Lester Mackey
Poster
Wed 1:00 Knowledge distillation via softmax regression representation learning
Jing Yang, Brais Martinez, Adrian Bulat, Georgios Tzimiropoulos
Poster
Thu 9:00 Initialization and Regularization of Factorized Neural Layers
Misha Khodak, Neil Tenenholtz, Lester Mackey, Nicolo Fusi
Poster
Thu 17:00 Neural Pruning via Growing Regularization
Huan Wang, Can Qin, Yulun Zhang, Yun Fu
Workshop
Fri 6:00 Keynote 3 (Ehsan Saboori): Deep learning model compression using neural network design space exploration