In-Person Poster presentation / poster accept

Characterizing the spectrum of the NTK via a power series expansion

Michael Murray · Hui Jin · Benjamin Bowman · Guido Montufar

MH1-2-3-4 #160

Keywords: [ Theory ] [ neural tangent kernel ] [ activation function ] [ Hermite coefficient ] [ input Gram matrix ] [ spectrum ] [ power series ]


Abstract:

Under mild conditions on the network initialization, we derive a power series expansion for the Neural Tangent Kernel (NTK) of arbitrarily deep feedforward networks in the infinite-width limit. We provide expressions for the coefficients of this power series, which depend on both the Hermite coefficients of the activation function and the depth of the network. We observe that faster decay of the Hermite coefficients leads to faster decay of the NTK coefficients, and we explore the role of depth. Using this series, we first relate the effective rank of the NTK to the effective rank of the input-data Gram matrix. Second, for data drawn uniformly from the sphere, we study the eigenvalues of the NTK, analyzing the impact of the choice of activation function. Finally, for generic data and activation functions with sufficiently fast Hermite coefficient decay, we derive an asymptotic upper bound on the spectrum of the NTK.
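The Hermite machinery underlying the abstract can be illustrated numerically. The sketch below (not code from the paper; the function names and tolerances are illustrative) estimates the normalized Hermite coefficients mu_k of an activation via Gauss-Hermite quadrature, and uses the standard identity E[sigma(u) sigma(v)] = sum_k mu_k^2 rho^k for jointly Gaussian unit-variance (u, v) with correlation rho — the kind of power series in the input correlation that the paper's NTK expansion is built from. For ReLU, the result can be checked against the closed-form arc-cosine kernel.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi, acos

def hermite_coeffs(sigma, n_coeffs=8, quad_deg=201):
    """Estimate mu_k = E[sigma(Z) he_k(Z)/sqrt(k!)] for Z ~ N(0, 1),
    where he_k are probabilists' Hermite polynomials, via
    Gauss-Hermite quadrature (weight exp(-x^2/2))."""
    x, w = He.hermegauss(quad_deg)
    w = w / sqrt(2 * pi)  # turn the quadrature into an N(0,1) expectation
    mu = np.empty(n_coeffs)
    for k in range(n_coeffs):
        e_k = np.zeros(k + 1)
        e_k[k] = 1.0
        he_k = He.hermeval(x, e_k) / sqrt(factorial(k))  # normalized he_k
        mu[k] = np.sum(w * sigma(x) * he_k)
    return mu

def dual_kernel(rho, mu):
    """Truncated power series sum_k mu_k^2 rho^k for the
    expected product E[sigma(u) sigma(v)] at correlation rho."""
    return sum(m ** 2 * rho ** k for k, m in enumerate(mu))

relu = lambda x: np.maximum(x, 0.0)
mu = hermite_coeffs(relu)

# Sanity check against the closed-form arc-cosine kernel for ReLU:
# E[relu(u) relu(v)] = (sqrt(1 - rho^2) + rho * (pi - arccos(rho))) / (2 * pi).
rho = 0.5
closed_form = (sqrt(1 - rho ** 2) + rho * (pi - acos(rho))) / (2 * pi)
print(mu[:4])
print(dual_kernel(rho, mu), closed_form)
```

For ReLU the coefficients decay quickly (and vanish for odd k >= 3), so a short truncation of the series already matches the closed form well; activations with slower Hermite decay would need many more terms, which is the decay/spectrum connection the abstract studies.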
