arXiv Analytics

arXiv:1901.08987 [cs.LG]

Dynamical Isometry and a Mean Field Theory of LSTMs and GRUs

Dar Gilboa, Bo Chang, Minmin Chen, Greg Yang, Samuel S. Schoenholz, Ed H. Chi, Jeffrey Pennington

Published 2019-01-25 (Version 1)

Training recurrent neural networks (RNNs) on long sequence tasks is plagued with difficulties arising from the exponential explosion or vanishing of signals as they propagate forward or backward through the network. Many techniques have been proposed to ameliorate these issues, including various algorithmic and architectural modifications. Two of the most successful RNN architectures, the LSTM and the GRU, do exhibit modest improvements over vanilla RNN cells, but they still suffer from instabilities when trained on very long sequences. In this work, we develop a mean field theory of signal propagation in LSTMs and GRUs that enables us to calculate the time scales for signal propagation as well as the spectral properties of the state-to-state Jacobians. By optimizing these quantities in terms of the initialization hyperparameters, we derive a novel initialization scheme that eliminates or reduces training instabilities. We demonstrate the efficacy of our initialization scheme on multiple sequence tasks, on which it enables successful training while a standard initialization either fails completely or is orders of magnitude slower. We also observe a beneficial effect on generalization performance using this new initialization.
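As a rough illustration of the quantity the abstract refers to, the snippet below numerically estimates the singular value spectrum of the hidden-state Jacobian of a standard LSTM cell at a random Gaussian initialization. This is a minimal sketch, not the authors' code or their full analysis: names such as hidden_dim and weight_scale are placeholders, and for brevity it only computes the block ∂h_t/∂h_{t-1} rather than the full state-to-state Jacobian over (h, c) studied in the paper.

```python
# Minimal sketch (assumption, not the paper's implementation): singular values
# of the hidden-to-hidden Jacobian of a standard LSTM cell at initialization.
import jax
import jax.numpy as jnp

hidden_dim = 64        # illustrative size
weight_scale = 1.0     # std multiplier for the Gaussian weight init (assumption)

key = jax.random.PRNGKey(0)
k_w, k_u, k_x, k_h, k_c = jax.random.split(key, 5)

# Input-to-hidden and hidden-to-hidden weights for the four LSTM gates (i, f, o, g).
W = jax.random.normal(k_w, (4 * hidden_dim, hidden_dim)) * weight_scale / jnp.sqrt(hidden_dim)
U = jax.random.normal(k_u, (4 * hidden_dim, hidden_dim)) * weight_scale / jnp.sqrt(hidden_dim)
b = jnp.zeros(4 * hidden_dim)

def lstm_cell(state, x):
    """One step of a standard LSTM; state = (h, c)."""
    h, c = state
    z = W @ x + U @ h + b
    i, f, o, g = jnp.split(z, 4)
    c_new = jax.nn.sigmoid(f) * c + jax.nn.sigmoid(i) * jnp.tanh(g)
    h_new = jax.nn.sigmoid(o) * jnp.tanh(c_new)
    return (h_new, c_new)

x = jax.random.normal(k_x, (hidden_dim,))
h0 = jax.random.normal(k_h, (hidden_dim,))
c0 = jax.random.normal(k_c, (hidden_dim,))

# Jacobian of the new hidden state with respect to the previous hidden state.
J = jax.jacrev(lambda h: lstm_cell((h, c0), x)[0])(h0)
singular_values = jnp.linalg.svd(J, compute_uv=False)
print("max / mean singular value:", singular_values.max(), singular_values.mean())
```

When this spectrum is far from 1, forward signals and backpropagated gradients are rescaled by a comparable factor at every time step, which compounds exponentially over long sequences; an initialization achieving dynamical isometry concentrates the spectrum near 1.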

Related articles:
arXiv:1912.09132 [cs.LG] (Published 2019-12-19)
Mean field theory for deep dropout networks: digging up gradient backpropagation deeply
arXiv:2202.11364 [cs.LG] (Published 2022-02-23)
FastRPB: a Scalable Relative Positional Encoding for Long Sequence Tasks
arXiv:2306.15368 [cs.LG] (Published 2023-06-27)
Mean Field Theory in Deep Metric Learning