27 Results
Type | Time | Title | Authors
Spotlight | Wed 10:30 | Churn Reduction via Distillation | Heinrich Jiang · Harikrishna Narasimhan · Dara Bahri · Andrew Cotter · Afshin Rostamizadeh
Poster | Thu 2:30 | Prototypical Contrastive Predictive Coding | Kyungmin Lee
Poster | Mon 2:30 | Progressive Distillation for Fast Sampling of Diffusion Models | Tim Salimans · Jonathan Ho
Poster | Thu 2:30 | Distilling GANs with Style-Mixed Triplets for X2I Translation with Limited Data | Yaxing Wang · Joost van de Weijer · Lu Yu · SHANGLING JUI
Spotlight | Wed 18:30 | Online Hyperparameter Meta-Learning with Hypergradient Distillation | Hae Beom Lee · Hayeon Lee · JaeWoong Shin · Eunho Yang · Timothy Hospedales · Sung Ju Hwang
Poster | Thu 10:30 | Graph-less Neural Networks: Teaching Old MLPs New Tricks Via Distillation | Shichang Zhang · Yozen Liu · Yizhou Sun · Neil Shah
Spotlight | Mon 2:30 | Progressive Distillation for Fast Sampling of Diffusion Models | Tim Salimans · Jonathan Ho
Poster | Wed 18:30 | Online Hyperparameter Meta-Learning with Hypergradient Distillation | Hae Beom Lee · Hayeon Lee · JaeWoong Shin · Eunho Yang · Timothy Hospedales · Sung Ju Hwang
Workshop | Fri 10:25 | IFACD: Intermediate Features Augmented Contrastive Distillation | Edwin Arkel Rios
Workshop | Fri 8:20 | KDSTM: Neural Semi-supervised Topic Modeling with Knowledge Distillation | Weijie Xu · Xiaoyu Jiang · Jay Desai · Bin Han · Fuqin Yan · Francis Iannacci
Workshop | | ConceptDistil: Model-Agnostic Distillation of Concept Explanations | João Pedro Sousa · Ricardo Moreira · Vladimir Balayan · Pedro Saleiro · Pedro Bizarro
Workshop | | Sparse Logits Suffice to Fail Knowledge Distillation | Haoyu Ma · Yifan Huang · Hao Tang · Chenyu You · Deying Kong · Xiaohui Xie