Search All 2021 Events
17 Results

Page 1 of 2
Spotlight
Wed 20:40 Undistillable: Making A Nasty Teacher That CANNOT teach students
Haoyu Ma · Tianlong Chen · Ting-Kuei Hu · Chenyu You · Xiaohui Xie · Zhangyang Wang
Spotlight
Mon 12:05 Generalization bounds via distillation
Daniel Hsu · Ziwei Ji · Matus Telgarsky · Lan Wang
Poster
Thu 9:00 Dataset Meta-Learning from Kernel Ridge-Regression
Timothy Nguyen · Zhourong Chen · Jaehoon Lee
Poster
Mon 17:00 Undistillable: Making A Nasty Teacher That CANNOT teach students
Haoyu Ma · Tianlong Chen · Ting-Kuei Hu · Chenyu You · Xiaohui Xie · Zhangyang Wang
Poster
Thu 9:00 A teacher-student framework to distill future trajectories
Alexander Neitz · Giambattista Parascandolo · Bernhard Schoelkopf
Poster
Thu 9:00 Initialization and Regularization of Factorized Neural Layers
Mikhail Khodak · Neil Tenenholtz · Lester Mackey · Nicolo Fusi
Poster
Mon 1:00 Is Label Smoothing Truly Incompatible with Knowledge Distillation: An Empirical Study
Zhiqiang Shen · Dejia Xu · Zitian Chen · Kwang-Ting Cheng · Marios Savvides
Poster
Thu 9:00 Rethinking Soft Labels for Knowledge Distillation: A Bias–Variance Tradeoff Perspective
Helong Zhou · Liangchen Song · Jiajie Chen · Ye Zhou · Guoli Wang · Junsong Yuan · Qian Zhang
Poster
Thu 1:00 Distilling Knowledge from Reader to Retriever for Question Answering
Gautier Izacard · Edouard Grave
Poster
Mon 17:00 Neural Attention Distillation: Erasing Backdoor Triggers from Deep Neural Networks
Yige Li · Xixiang Lyu · Nodens Koren · Lingjuan Lyu · Bo Li · Xingjun Ma
Poster
Wed 9:00 SEED: Self-supervised Distillation For Visual Representation
Zhiyuan Fang · Jianfeng Wang · Lijuan Wang · Lei Zhang · Yezhou Yang · Zicheng Liu
Poster
Thu 9:00 Efficient Transformers in Reinforcement Learning using Actor-Learner Distillation
Emilio Parisotto · Ruslan Salakhutdinov