

Poster in Workshop: Bridging the Gap Between Practice and Theory in Deep Learning

Bridging Empirics and Theory: Unveiling Asymptotic Universality Across Gaussian and Gaussian Mixture Inputs in Deep Learning

Jaeyong Bae · Hawoong Jeong


Abstract:

This research extends the theoretical understanding of deep learning by examining the behavior of neural networks on Gaussian Mixture (GM) structured inputs. Diverging from the conventional use of simple Gaussian inputs, our investigation reveals that the dynamics of neural networks trained on GM inputs, which are more reflective of real-world complexity, converge toward the dynamics predicted for Gaussian inputs. This convergence points to a previously unidentified universality in neural network behavior: even under the more complex distributions presented by GMs, networks exhibit predictable asymptotic behavior consistent with conventional theoretical predictions. This finding enriches the theoretical foundation of the field, suggesting that neural network predictions remain robust across varied input distributions.
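To make the universality claim concrete, below is a minimal numerical sketch, not the authors' experiment: a two-layer tanh student trained by full-batch gradient descent on a fixed linear teacher, once with standard Gaussian inputs and once with Gaussian-mixture inputs. The functions `sample_gm` and `train`, the teacher-student setup, and the parameters `k` and `spread` are all illustrative assumptions introduced here for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, hidden, steps, lr = 50, 2000, 100, 500, 0.1

def sample_gaussian(n, d):
    # Standard Gaussian inputs: the setting assumed by much of the theory.
    return rng.standard_normal((n, d))

def sample_gm(n, d, k=4, spread=2.0):
    # Gaussian mixture (illustrative): k random cluster means, unit covariance.
    means = spread * rng.standard_normal((k, d)) / np.sqrt(d)
    labels = rng.integers(k, size=n)
    return means[labels] + rng.standard_normal((n, d))

w_teacher = rng.standard_normal(d) / np.sqrt(d)  # fixed linear teacher

def train(X, seed=1):
    # Two-layer tanh student trained with full-batch gradient descent.
    init = np.random.default_rng(seed)             # identical init for both runs
    W1 = init.standard_normal((d, hidden)) / np.sqrt(d)
    w2 = init.standard_normal(hidden) / np.sqrt(hidden)
    y = X @ w_teacher
    losses = []
    for _ in range(steps):
        h = np.tanh(X @ W1)                        # hidden activations, (n, hidden)
        err = h @ w2 - y                           # residuals, (n,)
        losses.append(float(np.mean(err ** 2)))
        grad_w2 = h.T @ err / n                    # backprop: output layer
        grad_h = np.outer(err, w2) * (1 - h ** 2)  # backprop: through tanh
        W1 -= lr * (X.T @ grad_h / n)
        w2 -= lr * grad_w2
    return losses

loss_gauss = train(sample_gaussian(n, d))
loss_gm = train(sample_gm(n, d))

# If the universality claim holds in this toy setting, the two loss
# trajectories should track each other closely.
for t in range(0, steps, 100):
    print(f"step {t:4d}   Gaussian MSE {loss_gauss[t]:.4f}   GM MSE {loss_gm[t]:.4f}")
```

The paper's result concerns asymptotic behavior, so this small-dimension sketch can only illustrate the qualitative phenomenon: with a shared initialization, the loss trajectories under the two input distributions should grow increasingly similar as the input dimension and network width are scaled up.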
