Poster

Hyperbolic Attention Networks

Caglar Gulcehre · Misha Denil · Mateusz Malinowski · Ali Razavi · Razvan Pascanu · Karl Moritz Hermann · Victor Bapst · Adam Santoro · Nando de Freitas

Great Hall BC #44

Keywords: [ hyperbolic geometry ] [ attention methods ] [ reasoning on graphs ] [ relation learning ] [ scale free graphs ] [ transformers ] [ power law ]


Abstract:

Recent approaches have successfully demonstrated the benefits of learning the parameters of shallow networks in hyperbolic space. We extend this line of work by imposing hyperbolic geometry on the embeddings used to compute the ubiquitous attention mechanisms of different neural network architectures. By changing only the geometry of the object-representation embeddings, we use the embedding space more efficiently without increasing the number of model parameters. Because the number of objects at any given semantic distance from a query can grow exponentially, hyperbolic geometry, as opposed to Euclidean geometry, can encode those objects without interference. Our method improves generalization on neural machine translation on WMT'14 (English to German), on learning on graphs (both synthetic and real-world graph tasks), and on visual question answering (CLEVR), while keeping the neural representations compact.
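To make the core idea concrete, below is a minimal sketch of distance-based attention in the Poincaré ball, where attention logits are negative hyperbolic distances between query and key embeddings. This is an illustrative assumption rather than the paper's exact formulation: the function names (`poincare_distance`, `hyperbolic_attention`), the temperature `beta`, and the plain Euclidean aggregation of values are all simplifications introduced here.

```python
# Sketch of hyperbolic attention, assuming the Poincare ball model with
# curvature -1. Illustrative only; the paper's exact operators (e.g.,
# manifold-aware aggregation) may differ.
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between points u, v inside the unit Poincare ball."""
    sq = np.sum((u - v) ** 2, axis=-1)
    nu = np.clip(np.sum(u * u, axis=-1), 0.0, 1.0 - eps)
    nv = np.clip(np.sum(v * v, axis=-1), 0.0, 1.0 - eps)
    return np.arccosh(1.0 + 2.0 * sq / ((1.0 - nu) * (1.0 - nv)))

def hyperbolic_attention(queries, keys, values, beta=1.0):
    """Attention weights from negative hyperbolic distances; values are
    aggregated with an ordinary weighted average for simplicity."""
    # Pairwise distances d[i, j] between query i and key j via broadcasting.
    d = poincare_distance(queries[:, None, :], keys[None, :, :])
    logits = -beta * d                           # closer points attend more
    logits -= logits.max(axis=-1, keepdims=True) # numerical stability
    w = np.exp(logits)
    w /= w.sum(axis=-1, keepdims=True)           # softmax over keys
    return w @ values

# Toy usage: 4 queries/keys embedded in a 2-D Poincare ball.
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 2))
q /= 1.0 + np.linalg.norm(q, axis=-1, keepdims=True)  # map inside unit ball
k = rng.standard_normal((4, 2))
k /= 1.0 + np.linalg.norm(k, axis=-1, keepdims=True)
v = rng.standard_normal((4, 8))
print(hyperbolic_attention(q, k, v).shape)  # (4, 8)
```

The property being exploited is that volume in hyperbolic space grows exponentially with radius, so exponentially many keys can sit at comparable distances from a query without crowding each other, which is what the abstract means by encoding objects "without interference".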
