

Poster

Exploring The Loss Landscape Of Regularized Neural Networks Via Convex Duality

Sungyoon Kim · Aaron Mishkin · Mert Pilanci

Hall 3 + Hall 2B #350
Thu 24 Apr midnight PDT — 2:30 a.m. PDT
Oral presentation: Oral Session 1E
Wed 23 Apr 7:30 p.m. PDT — 9 p.m. PDT

Abstract:

We study several aspects of the loss landscape of regularized neural networks: the structure of stationary points, the connectivity of optimal solutions, paths of non-increasing loss to an arbitrary global optimum, and the nonuniqueness of optimal solutions. Our approach casts the training problem as an equivalent convex problem and analyzes its dual. Starting from two-layer neural networks with scalar output, we first characterize the solution set of the convex problem using its dual, and then characterize all stationary points. With this characterization, we show that the topology of the set of global optima undergoes a phase transition as the width of the network changes, and we construct counterexamples in which the problem has a continuum of optimal solutions. Finally, we show that the solution-set characterization and the connectivity results extend to other architectures, including two-layer vector-valued neural networks and parallel three-layer neural networks.
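The equivalent convex problem referenced in the abstract is not spelled out on this page. As a hedged illustration only, the following sketches the standard convex reformulation of a two-layer ReLU network with weight decay from the broader line of work by Pilanci and collaborators; the symbols and the exact form here are assumptions, not the paper's own statement.

```latex
% Assumed setting: data matrix X \in \mathbb{R}^{n \times d}, labels y,
% two-layer ReLU network trained with weight decay of strength \beta.
% Enumerating the finitely many ReLU activation patterns
% D_i = \mathrm{diag}(\mathbb{1}[X u_i \ge 0]), i = 1, \dots, P,
% yields an equivalent convex, group-sparse program:
\min_{\{v_i, w_i\}_{i=1}^{P}} \;
  \mathcal{L}\Big( \sum_{i=1}^{P} D_i X (v_i - w_i),\, y \Big)
  + \beta \sum_{i=1}^{P} \big( \lVert v_i \rVert_2 + \lVert w_i \rVert_2 \big)
\quad \text{s.t.} \quad
  (2D_i - I_n) X v_i \ge 0, \;\; (2D_i - I_n) X w_i \ge 0 .
```

The paper's analysis of stationary points, connectivity, and nonuniqueness then proceeds through the dual of a convex program of this kind.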
