arXiv Analytics


arXiv:1901.03909 [stat.ML]

Eliminating All Bad Local Minima from Loss Landscapes Without Even Adding an Extra Unit

Jascha Sohl-Dickstein, Kenji Kawaguchi

Published 2019-01-12 (Version 1)

Recent work has noted that all bad local minima can be removed from neural network loss landscapes by adding a single unit with a particular parameterization. We show that the core technique from these papers can be used to remove all bad local minima from any loss landscape, so long as the global minimum has a loss of zero. The procedure requires neither the addition of auxiliary units nor that the loss be associated with a neural network. It acts by converting all bad local minima into bad (non-local) minima at infinity with respect to the auxiliary parameters.
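The flavor of this construction can be illustrated with a minimal numerical sketch. The augmented loss below, exp(b) * L(theta) with a single auxiliary scalar b, is an assumption chosen for illustration, not necessarily the paper's exact parameterization. It has the property the abstract describes: any finite stationary point of the augmented loss must satisfy L(theta) = 0 (since the b-gradient is exp(b) * L(theta)), so a bad local minimum of L is no longer a local minimum of the augmented loss; instead, descent pushes the auxiliary parameter b toward negative infinity while the original loss stays positive, i.e. the bad minimum has been moved to infinity.

```python
import numpy as np

# Illustrative 1-D loss with global minimum 0 at theta = 1 and a bad local
# minimum (value ~0.39) near theta = -0.95, as required by the construction.
def L(t):
    return (t - 1.0) ** 2 * ((t + 1.0) ** 2 + 0.1)

def dL(t):  # analytic derivative of L
    return 2 * (t - 1.0) * ((t + 1.0) ** 2 + 0.1) + (t - 1.0) ** 2 * 2 * (t + 1.0)

# Plain gradient descent from the left basin gets stuck at the bad local min.
theta = -1.5
for _ in range(2000):
    theta -= 0.01 * dL(theta)
# Here L(theta) > 0 although dL(theta) is ~0: a bad local minimum of L.

# Hypothetical augmented loss (an assumption, not the paper's exact formula):
# Ltilde(theta, b) = exp(b) * L(theta). Its b-gradient is exp(b) * L(theta),
# which vanishes at finite b only if L(theta) = 0, so every finite local
# minimum of Ltilde is a global minimum of L.
def Ltilde(t, b):
    return np.exp(b) * L(t)

# At the bad point, decreasing b strictly decreases Ltilde, so (theta, 0) is
# not a local minimum of the augmented loss.
escapes = Ltilde(theta, -0.1) < Ltilde(theta, 0.0)

# Joint descent on (theta, b): b drifts toward -infinity, the augmented loss
# goes to 0, but the original loss L(theta) remains bounded away from 0 --
# the bad minimum has become a minimum "at infinity" in the auxiliary parameter.
theta_b, b = -1.5, 0.0
for _ in range(5000):
    g_theta = np.exp(b) * dL(theta_b)
    g_b = np.exp(b) * L(theta_b)
    theta_b -= 0.01 * g_theta
    b -= 0.01 * g_b
```

After the joint descent, b is strongly negative and exp(b) * L(theta_b) is near zero even though L(theta_b) itself is still bounded away from zero, matching the abstract's description of bad minima being converted rather than reached.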

Related articles:
arXiv:1810.09038 [stat.ML] (Published 2018-10-21)
Depth with Nonlinearity Creates No Bad Local Minima in ResNets
arXiv:1611.06310 [stat.ML] (Published 2016-11-19)
Local minima in training of deep networks
arXiv:1805.08671 [stat.ML] (Published 2018-05-22)
Adding One Neuron Can Eliminate All Bad Local Minima