Invited Talk in Workshop: Geometrical and Topological Representation Learning

Roland Kwitt: Topologically Densified Distributions


Abstract:

In this talk, I am going to discuss some recent advances in the context of (topological) regularization for small sample-size learning with overparametrized neural networks. Specifically, I will shift the focus from architectural properties, such as norms on the network weights, to properties of the internal representations before a linear classifier. In particular, I will advocate a topological constraint on samples drawn from the probability measure induced in that space. This provably leads to mass-concentration effects around the representations of training instances, a property beneficial for generalization. Importantly, the topological constraints can be imposed efficiently by leveraging results from prior work. A series of experiments on popular (vision) benchmarks provides strong empirical evidence of better generalization in the small sample-size regime.
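To make the idea concrete, below is a minimal, hypothetical PyTorch sketch of one way such a connectivity constraint can be realized. It relies on the standard fact that the 0-dimensional persistent-homology death times of a Vietoris-Rips filtration over a point set equal the edge lengths of a minimum spanning tree on the pairwise distances; pushing those death times toward a target scale beta densifies the per-class representations in a mini-batch. The function name topological_density_loss, the parameter beta, and the exact penalty are illustrative assumptions, not the talk's implementation.

```python
# Hypothetical sketch: 0-dim persistence ("connectivity") regularizer on the
# internal representations z of a single class within a mini-batch.
import torch

def topological_density_loss(z: torch.Tensor, beta: float) -> torch.Tensor:
    """Penalize deviation of 0-dim persistence death times from a target scale.

    z:    (n, d) tensor of representations (one class, one mini-batch)
    beta: assumed target death time, i.e. the scale at which connected
          components of the batch should merge
    """
    n = z.shape[0]
    dist = torch.cdist(z, z)  # differentiable pairwise Euclidean distances

    # Prim's algorithm on detached distances selects the MST edges; their
    # lengths are exactly the 0-dim persistence death times of the
    # Vietoris-Rips filtration over the batch.
    d = dist.detach()
    in_tree = torch.zeros(n, dtype=torch.bool)
    in_tree[0] = True
    best = d[0].clone()                          # cheapest link to the tree
    best_from = torch.zeros(n, dtype=torch.long)  # endpoint of that link
    edges = []
    for _ in range(n - 1):
        masked = best.masked_fill(in_tree, float("inf"))
        j = int(torch.argmin(masked))            # next vertex to attach
        edges.append((int(best_from[j]), j))
        in_tree[j] = True
        closer = d[j] < best                     # relax links via vertex j
        best = torch.where(closer, d[j], best)
        best_from = torch.where(closer, torch.full_like(best_from, j), best_from)

    i_idx = torch.tensor([a for a, _ in edges])
    j_idx = torch.tensor([b for _, b in edges])
    deaths = dist[i_idx, j_idx]                  # differentiable edge lengths
    return (deaths - beta).abs().mean()
```

In training, one would typically add lambda * topological_density_loss(z_c, beta) to the cross-entropy objective for each class c represented in the batch, where lambda and beta are hypothetical hyperparameters tuned on held-out data.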
