11 Results
Type | Time | Title | Authors
Poster | Thu 2:30 | Prototypical Contrastive Predictive Coding | Kyungmin Lee
Poster | Wed 10:30 | Better Supervisory Signals by Observing Learning Paths | YI REN · Shangmin Guo · Danica J Sutherland
Poster | Tue 10:30 | Feature Kernel Distillation | Bobby He · Mete Ozay
Poster | Thu 18:30 | LFPT5: A Unified Framework for Lifelong Few-shot Language Learning Based on Prompt Tuning of T5 | Chengwei Qin · Shafiq Joty
Poster | Mon 10:30 | Open-vocabulary Object Detection via Vision and Language Knowledge Distillation | Xiuye Gu · Tsung-Yi Lin · Weicheng Kuo · Yin Cui
Poster | Tue 10:30 | Towards Model Agnostic Federated Learning Using Knowledge Distillation | Andrei Afonin · Sai Karimireddy
Poster | Mon 2:30 | Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations | Fangyu Liu · Yunlong Jiao · Jordan Massiah · Emine Yilmaz · Serhii Havrylov
Workshop | | Sparse Logits Suffice to Fail Knowledge Distillation | Haoyu Ma · Yifan Huang · Hao Tang · Chenyu You · Deying Kong · Xiaohui Xie
Poster | Wed 18:30 | Cold Brew: Distilling Graph Node Representations with Incomplete or Missing Neighborhoods | Wenqing Zheng · Edward Huang · Nikhil Rao · Sumeet Katariya · Zhangyang Wang · Karthik Subbian
Workshop | Fri 8:20 | KDSTM: Neural Semi-supervised Topic Modeling with Knowledge Distillation | Weijie Xu · Xiaoyu Jiang · Jay Desai · Bin Han · Fuqin Yan · Francis Iannacci
Poster | Thu 10:30 | Bag of Instances Aggregation Boosts Self-supervised Distillation | Haohang Xu · Jiemin Fang · XIAOPENG ZHANG · Lingxi Xie · Xinggang Wang · Wenrui Dai · Hongkai Xiong · Qi Tian