

Search All 2023 Events

27 Results

Page 1 of 3
Oral
Wed 6:20 Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning
Zeyuan Allen-Zhu · Yuanzhi Li
Poster
Towards Understanding Ensemble, Knowledge Distillation and Self-Distillation in Deep Learning
Zeyuan Allen-Zhu · Yuanzhi Li
Poster
IDEAL: Query-Efficient Data-Free Learning from Black-Box Models
Jie Zhang · Chen Chen · Lingjuan Lyu
Poster
Wed 2:30 NORM: Knowledge Distillation via N-to-One Representation Matching
Xiaolong Liu · Lujun Li · Chao Li · Anbang Yao
Poster
Learning MLPs on Graphs: A Unified View of Effectiveness, Robustness, and Efficiency
Yijun Tian · Chuxu Zhang · Zhichun Guo · Xiangliang Zhang · Nitesh Chawla
Poster
Mon 2:30 3D Segmenter: 3D Transformer based Semantic Segmentation via 2D Panoramic Distillation
Zhennan Wu · Yang Li · Yifei Huang · Lin Gu · Tatsuya Harada · Hiroyuki Sato
Poster
Mon 7:30 Scaffolding a Student to Instill Knowledge
Anil Kag · Durmus Alp Emre Acar · Aditya Gangrade · Venkatesh Saligrama
Poster
Pseudo-label Training and Model Inertia in Neural Machine Translation
Benjamin Hsu · Anna Currey · Xing Niu · Maria Nadejde · Georgiana Dinu
Poster
Knowledge Distillation based Degradation Estimation for Blind Super-Resolution
Bin Xia · Yulun Zhang · Yitong Wang · Yapeng Tian · Wenming Yang · Radu Timofte · Luc Van Gool
Poster
Better Teacher Better Student: Dynamic Prior Knowledge for Knowledge Distillation
Martin Zong · Zengyu Qiu · Xinzhu Ma · Kunlin Yang · Chunya Liu · Jun Hou · Shuai Yi · Wanli Ouyang
Workshop
Robust Neural Architecture Search by Cross-Layer Knowledge Distillation
Utkarsh Nath · Yancheng Wang · Yingzhen Yang
Oral
Wed 6:20 The Modality Focusing Hypothesis: Towards Understanding Crossmodal Knowledge Distillation
Zihui Xue · Zhengqi Gao · Sucheng Ren · Hang Zhao