Poster

Robust Weight Initialization for Tanh Neural Networks with Fixed Point Analysis

Hyunwoo Lee · Hayoung Choi · Hyunju Kim

Hall 3 + Hall 2B #358
Fri 25 Apr, midnight – 2:30 a.m. PDT

Abstract: Increasing a neural network's depth can improve its generalization performance, but training deep networks is difficult due to gradient and signal propagation issues. Extensive theoretical research and a variety of methods have been introduced to address these challenges; nevertheless, effective weight initialization for tanh neural networks remains insufficiently investigated. This paper presents a novel weight initialization method for neural networks with the tanh activation function. Based on an analysis of the fixed points of the function tanh(ax), the proposed method determines values of a that mitigate activation saturation. Experiments on various classification datasets and on physics-informed neural networks demonstrate that the proposed method outperforms Xavier initialization (with or without normalization) in robustness across network sizes, data efficiency, and convergence speed. Code is available at https://github.com/1HyunwooLee/Tanh-Init.
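The authors' actual initialization rule is in the linked repository; the sketch below only illustrates the fixed-point idea the abstract alludes to. For a > 1, the iteration x ← tanh(ax) converges to a nonzero fixed point x*, and the slope a governs how strongly activations are pushed toward the saturated tails at ±1. The helper names (tanh_fixed_point, tanh_scaled_init) and the choice Var(W) = a²/fan_in are illustrative assumptions, not the paper's method.

```python
import numpy as np

def tanh_fixed_point(a, x0=0.9, tol=1e-12, max_iter=10_000):
    """Nonzero fixed point x* of f(x) = tanh(a * x) for a > 1,
    found by fixed-point iteration. For a <= 1 the iteration
    collapses to the only fixed point, x* = 0."""
    x = x0
    for _ in range(max_iter):
        x_next = np.tanh(a * x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

def tanh_scaled_init(fan_in, fan_out, a=1.5, rng=None):
    """Hypothetical initializer: uniform weights scaled so a unit's
    pre-activation has standard deviation roughly a (assuming inputs
    with zero mean and unit variance), i.e. it behaves like a * x."""
    rng = np.random.default_rng() if rng is None else rng
    bound = a * np.sqrt(3.0 / fan_in)  # Var(W) = a^2 / fan_in for U(-b, b)
    return rng.uniform(-bound, bound, size=(fan_in, fan_out))

# The nonzero fixed point of tanh(1.5 x) is roughly 0.858; keeping the
# effective slope a modest keeps activations out of the saturated tails.
print(tanh_fixed_point(1.5))
```

The iteration converges here because |f'(x*)| = a(1 − x*²) < 1 near the nonzero fixed point; how the paper selects a from this analysis is detailed in the full text.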
