Poster in Workshop: Bridging the Gap Between Practice and Theory in Deep Learning

ResNet-Induced Convex Integration Model: a Novel Robust Framework through the Regularity of Optimal Transport

Kuo Gai · Sicong Wang · Shihua Zhang


Abstract:

Deep neural networks are known to be vulnerable to small adversarial perturbations of their inputs. Empirical methods such as adversarial training can defend against a particular class of attacks but can still be "broken" by stronger attacks. Another approach constructs Lipschitz networks that are certifiably robust to perturbations, but their expressive ability is unsatisfactory. To combine the advantages of the two approaches, we design a novel two-step model that fits the training data accurately while preserving a local Lipschitz property. We first train a ResNet with a regularizer from optimal transport theory and obtain a discrete optimal transport map from the data to their features. Since the optimal transport map enjoys a regularity property, we interpolate the map by solving a convex integration problem, which guarantees that the interpolation is locally Lipschitz. Numerical experiments show that this model can outperform other robust models on diverse datasets when trained with a ResNet whose latent dimensionality is left unchanged.
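Below is a minimal sketch, not the authors' code, of the first step as described in the abstract: training a residual network whose features keep the input dimensionality, with a squared transport-cost penalty ||f(x) - x||^2 standing in for the paper's optimal-transport regularizer. The architecture, dimensions, and the weight `ot_weight` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Residual map x -> x + g(x); keeps the latent dimensionality unchanged."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        return x + self.net(x)

dim, num_classes, ot_weight = 784, 10, 0.1          # assumed sizes and weight
feature_map = nn.Sequential(*[ResidualBlock(dim) for _ in range(4)])  # f: R^dim -> R^dim
classifier = nn.Linear(dim, num_classes)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(
    list(feature_map.parameters()) + list(classifier.parameters()), lr=0.1
)

def training_step(x, y):
    """One optimization step on a batch of flattened inputs x and labels y."""
    z = feature_map(x)                                      # features f(x)
    loss = criterion(classifier(z), y)                      # fit the training data
    loss = loss + ot_weight * (z - x).pow(2).sum(1).mean()  # transport-cost regularizer
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The second step, interpolating the resulting discrete optimal transport map by solving a convex integration problem, is specific to the paper and is not sketched here.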
