
Poster

Effective post-training embedding compression via temperature control in contrastive training

Georgiana Dinu · Corey Barrett · Yi Xiang · Miguel Romero Calvo · Anna Currey · Xing Niu

Hall 3 + Hall 2B #595
Sat 26 Apr midnight PDT — 2:30 a.m. PDT

Abstract:

Fixed-size learned representations (dense representations, or embeddings) are widely used in many machine learning applications across language, vision, and speech modalities. This paper investigates the role of the temperature parameter in contrastive training for text embeddings. We shed light on the impact this parameter has on the intrinsic dimensionality of the resulting embedding spaces, and show that lower intrinsic dimensionality is further correlated with effective compression of embeddings. A trade-off between absolute performance and effective compression nevertheless remains, and we propose temperature aggregation methods that reduce embedding size by an order of magnitude with minimal impact on quality.
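The abstract centers on the temperature parameter of the contrastive objective. As a point of reference, the sketch below shows a standard in-batch-negatives contrastive (InfoNCE-style) loss with a temperature term, which is the kind of objective the paper studies; the function name, signature, and default temperature are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(query_emb: torch.Tensor,
                     pos_emb: torch.Tensor,
                     temperature: float = 0.05) -> torch.Tensor:
    """In-batch-negatives contrastive loss with a temperature parameter.

    query_emb, pos_emb: (batch, dim) embeddings of paired texts.
    A lower temperature sharpens the softmax over similarities,
    concentrating gradient signal on the hardest negatives.
    (Illustrative sketch; not the paper's code.)
    """
    q = F.normalize(query_emb, dim=-1)
    p = F.normalize(pos_emb, dim=-1)
    # Cosine similarities between every query and every candidate in the batch,
    # scaled by the temperature before the softmax inside cross-entropy.
    logits = q @ p.T / temperature
    # The matching positive for query i sits on the diagonal (index i).
    labels = torch.arange(q.size(0), device=q.device)
    return F.cross_entropy(logits, labels)
```

Varying `temperature` here is the knob the paper analyzes: its value changes how peaked the similarity distribution is, which in turn affects the intrinsic dimensionality of the learned embedding space.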
