
In-Person Poster presentation / poster accept

Bispectral Neural Networks

Sophia Sanborn · Christian Shewmake · Bruno Olshausen · Christopher Hillar

MH1-2-3-4 #90

Keywords: [ Deep Learning and representational learning ] [ Representation Theory ] [ representation learning ] [ group theory ] [ invariance ] [ symmetry ] [ geometry ]


Abstract:

We present a neural network architecture, Bispectral Neural Networks (BNNs), for learning representations that are invariant to the actions of compact commutative groups on the space over which a signal is defined. The model incorporates the ansatz of the bispectrum, an analytically defined group invariant that is complete: it preserves all signal structure while removing only the variation due to group actions. Here, we demonstrate that BNNs are able to simultaneously learn groups, their irreducible representations, and corresponding equivariant and complete-invariant maps purely from the symmetries implicit in data. Further, we demonstrate that the completeness property endows these networks with strong invariance-based adversarial robustness. This work establishes Bispectral Neural Networks as a powerful computational primitive for robust invariant representation learning.
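To make the ansatz concrete, the sketch below computes the classical (analytical) bispectrum for the simplest compact commutative group, cyclic translation of a 1D signal, and checks its invariance. This is an illustrative assumption about the underlying construction, not the paper's learned network: BNNs learn the group and its irreducible representations from data, whereas here the Fourier basis of the cyclic group is supplied directly via the DFT.

```python
import numpy as np

def bispectrum(x):
    """Bispectrum of a 1D signal under the cyclic (translation) group.

    B[i, j] = F[i] * F[j] * conj(F[(i + j) mod n]), where F is the DFT of x.
    A cyclic shift multiplies F[k] by a phase exp(-2*pi*1j*k*t/n); the three
    phases cancel in the product, so B is shift-invariant. It is also complete
    (up to shift) because the signal can be recovered from B.
    """
    F = np.fft.fft(x)
    n = len(x)
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return F[i] * F[j] * np.conj(F[(i + j) % n])

# Invariance check: cyclically shifting the signal leaves the bispectrum unchanged.
rng = np.random.default_rng(0)
x = rng.standard_normal(16)
x_shifted = np.roll(x, 5)
print("Bispectra match:", np.allclose(bispectrum(x), bispectrum(x_shifted)))
```

In the paper's setting, the fixed DFT above would be replaced by learned linear maps whose weights are optimized so that the resulting triple product becomes invariant to whichever group action is implicit in the training data.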
