Search All 2022 Events
27 Results

Poster
Mon 2:30 Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations
Fangyu Liu · Yunlong Jiao · Jordan Massiah · Emine Yilmaz · Serhii Havrylov
Poster
Tue 2:30 Object Dynamics Distillation for Scene Decomposition and Representation
Qu Tang · Xiangyu Zhu · Zhen Lei · Zhaoxiang Zhang
Poster
Tue 10:30 Towards Model Agnostic Federated Learning Using Knowledge Distillation
Andrei Afonin · Sai Karimireddy
Poster
Mon 10:30 Open-vocabulary Object Detection via Vision and Language Knowledge Distillation
Xiuye Gu · Tsung-Yi Lin · Weicheng Kuo · Yin Cui
Poster
Thu 18:30 Reliable Adversarial Distillation with Unreliable Teachers
Jianing Zhu · Jiangchao Yao · Bo Han · Jingfeng Zhang · Tongliang Liu · Gang Niu · Jingren Zhou · Jianliang Xu · Hongxia Yang
Poster
Thu 10:30 Unsupervised Semantic Segmentation by Distilling Feature Correspondences
Mark Hamilton · Zhoutong Zhang · Bharath Hariharan · Noah Snavely · William Freeman
Poster
Thu 18:30 LFPT5: A Unified Framework for Lifelong Few-shot Language Learning Based on Prompt Tuning of T5
Chengwei Qin · Shafiq Joty
Poster
Wed 10:30 Better Supervisory Signals by Observing Learning Paths
Yi Ren · Shangmin Guo · Danica J Sutherland
Poster
Thu 10:30 Bag of Instances Aggregation Boosts Self-supervised Distillation
Haohang Xu · Jiemin Fang · Xiaopeng Zhang · Lingxi Xie · Xinggang Wang · Wenrui Dai · Hongkai Xiong · Qi Tian
Poster
Tue 10:30 Data Efficient Language-Supervised Zero-Shot Recognition with Optimal Transport Distillation
Bichen Wu · Ruizhe Cheng · Peizhao Zhang · Tianren Gao · Joseph E Gonzalez · Peter Vajda
Poster
Wed 10:30 Churn Reduction via Distillation
Heinrich Jiang · Harikrishna Narasimhan · Dara Bahri · Andrew Cotter · Afshin Rostamizadeh
Poster
Thu 10:30 Unified Visual Transformer Compression
Shixing Yu · Tianlong Chen · Jiayi Shen · Huan Yuan · Jianchao Tan · Sen Yang · Ji Liu · Zhangyang Wang