Poster

Approximate Nearest Neighbor Negative Contrastive Learning for Dense Text Retrieval

Lee Xiong · Chenyan Xiong · Ye Li · Kwok-Fung Tang · Jialin Liu · Paul N Bennett · Junaid Ahmed · Arnold Overwijk

Keywords: [ Neural IR ] [ Text Representation ] [ Text Retrieval ] [ Dense Retrieval ]


Abstract:

Conducting text retrieval in a learned dense representation space has many intriguing advantages. Yet dense retrieval (DR) often underperforms word-based sparse retrieval. In this paper, we first theoretically show that the bottleneck of dense retrieval is the domination of uninformative negatives sampled in mini-batch training, which yield diminishing gradient norms, large gradient variances, and slow convergence. We then propose Approximate nearest neighbor Negative Contrastive Learning (ANCE), which selects hard training negatives globally from the entire corpus. Our experiments demonstrate the effectiveness of ANCE on web search, question answering, and in a commercial search engine, showing that ANCE dot-product retrieval nearly matches the accuracy of a BERT-based cascade IR pipeline. We also empirically validate our theory that negative sampling with ANCE better approximates the oracle importance sampling procedure and improves learning convergence.
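To make the negative-selection step concrete, here is a minimal sketch of ANCE-style hard-negative mining in Python, assuming a FAISS inner-product index over corpus embeddings. The random vectors, dimensions, and variable names are illustrative stand-ins for the paper's actual encoder and corpus, not the authors' implementation.

```python
# Sketch of ANCE-style global hard-negative mining, assuming a FAISS
# dot-product index. Random embeddings stand in for the output of the
# current dense-retrieval encoder checkpoint (hypothetical setup).
import numpy as np
import faiss

rng = np.random.default_rng(0)
dim, corpus_size, n_queries = 128, 10_000, 32

# Stand-ins for encoder output over the whole corpus and a query batch.
corpus_emb = rng.standard_normal((corpus_size, dim)).astype("float32")
query_emb = rng.standard_normal((n_queries, dim)).astype("float32")
positives = rng.integers(0, corpus_size, size=n_queries)  # gold passage ids

# Build an exact inner-product index over the entire corpus; in practice
# the index is rebuilt periodically from recent encoder checkpoints.
index = faiss.IndexFlatIP(dim)
index.add(corpus_emb)

# Retrieve the top-scoring candidates globally for each query and keep the
# non-positive ones as hard negatives, rather than sampling uninformative
# negatives from the mini-batch.
_, topk = index.search(query_emb, 20)
hard_negatives = [
    [pid for pid in cand if pid != pos][:8]
    for cand, pos in zip(topk.tolist(), positives.tolist())
]
print(hard_negatives[0])  # hard-negative passage ids for the first query
```

Because these negatives score highly under the current model, they carry larger gradient norms than in-batch negatives, which is the property the paper's importance-sampling analysis relies on.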
