

Poster in Affinity Workshop: Tiny Papers Poster Session 4

When Does Second-Order Optimization Speed Up Training?

Satoki Ishikawa · Rio Yokota

Halle B #304
Wed 8 May 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

While numerous second-order optimization methods have been proposed to accelerate training in deep learning, they are seldom used in practice. This is partly due to a limited understanding of the conditions under which second-order optimization outperforms first-order optimization. This study aims to identify these conditions, particularly in terms of batch size and dataset size. We find empirically that second-order optimization outperforms first-order optimization when the batch size is large and the dataset size is not too large.
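To make the contrast between the two families of methods concrete, here is a minimal, self-contained sketch that is not from the paper: it compares a first-order update (gradient descent) with a second-order update (Newton's method) on a hypothetical toy quadratic objective. All names and constants are illustrative choices, not the authors' experimental setup.

```python
# Illustrative sketch only (not the paper's method): first-order vs.
# second-order updates on a toy quadratic loss L(w) = 0.5 w^T A w - b^T w.
import numpy as np

rng = np.random.default_rng(0)

d = 20
Q = rng.standard_normal((d, d))
A = Q @ Q.T + 1e-2 * np.eye(d)      # positive-definite Hessian (assumed toy problem)
b = rng.standard_normal(d)
w_star = np.linalg.solve(A, b)      # exact minimizer, for reference

def loss(w):
    return 0.5 * w @ A @ w - b @ w

def grad(w):
    return A @ w - b

# First-order: plain gradient descent with a stable step size.
w = np.zeros(d)
lr = 1.0 / np.linalg.eigvalsh(A).max()
for _ in range(100):
    w = w - lr * grad(w)
print("GD loss gap after 100 steps:", loss(w) - loss(w_star))

# Second-order: a Newton step rescales the gradient by the inverse Hessian;
# on a quadratic it reaches the minimizer in a single step.
w = np.zeros(d)
w = w - np.linalg.solve(A, grad(w))
print("Newton loss gap after 1 step:", loss(w) - loss(w_star))
```

The per-step cost of the second-order update is much higher (a linear solve against the Hessian), which is one reason the trade-off the abstract studies, as a function of batch size and dataset size, matters in practice.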
