Search All 2021 Events
Results

Page 1 of 2
Poster · Wed 1:00 · Knowledge distillation via softmax regression representation learning
Jing Yang · Brais Martinez · Adrian Bulat · Georgios Tzimiropoulos

Poster · Mon 17:00 · Neural Attention Distillation: Erasing Backdoor Triggers from Deep Neural Networks
Yige Li · Xixiang Lyu · Nodens Koren · Lingjuan Lyu · Bo Li · Xingjun Ma

Poster · Mon 17:00 · Undistillable: Making A Nasty Teacher That CANNOT teach students
Haoyu Ma · Tianlong Chen · Ting-Kuei Hu · Chenyu You · Xiaohui Xie · Zhangyang Wang

Poster · Thu 9:00 · A teacher-student framework to distill future trajectories
Alexander Neitz · Giambattista Parascandolo · Bernhard Schoelkopf

Spotlight · Wed 20:40 · Undistillable: Making A Nasty Teacher That CANNOT teach students
Haoyu Ma · Tianlong Chen · Ting-Kuei Hu · Chenyu You · Xiaohui Xie · Zhangyang Wang

Poster · Mon 1:00 · Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors
Linfeng Zhang · Kaisheng Ma

Poster · Thu 9:00 · Efficient Transformers in Reinforcement Learning using Actor-Learner Distillation
Emilio Parisotto · Ruslan Salakhutdinov

Poster · Thu 9:00 · Rethinking Soft Labels for Knowledge Distillation: A Bias–Variance Tradeoff Perspective
Helong Zhou · Liangchen Song · Jiajie Chen · Ye Zhou · Guoli Wang · Junsong Yuan · Qian Zhang

Poster · Thu 17:00 · Generalization bounds via distillation
Daniel Hsu · Ziwei Ji · Matus Telgarsky · Lan Wang

Poster · Mon 1:00 · Is Label Smoothing Truly Incompatible with Knowledge Distillation: An Empirical Study
Zhiqiang Shen · Dejia Xu · Zitian Chen · Kwang-Ting Cheng · Marios Savvides

Poster · Tue 17:00 · Knowledge Distillation as Semiparametric Inference
Tri Dao · Govinda Kamath · Vasilis Syrgkanis · Lester Mackey

Poster · Mon 17:00 · MixKD: Towards Efficient Distillation of Large-scale Language Models
Kevin Liang · Weituo Hao · Dinghan Shen · Yufan Zhou · Weizhu Chen · Changyou Chen · Lawrence Carin