DeepLearning2.6, Vanishing Gradient Problem
From Wulfram Gerstner
The vanishing gradient problem appears when BackProp is applied to a deep network with multiple layers: along most 'signaling paths' the gradient is equal to, or close to, zero.
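A minimal sketch of the effect, assuming a plain feedforward chain of sigmoid layers (the depth, width, and weight scale below are illustrative choices, not taken from the lecture): because the sigmoid derivative is at most 0.25, the back-propagated error signal shrinks at each layer, so the gradient norm decays roughly exponentially with depth.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical deep chain of sigmoid layers (depth and width chosen for illustration).
depth, width = 30, 50
weights = [rng.normal(0.0, 1.0 / np.sqrt(width), (width, width)) for _ in range(depth)]

# Forward pass, keeping pre-activations for the backward pass.
x = rng.normal(size=width)
activations, pre_acts = [x], []
for W in weights:
    z = W @ activations[-1]
    pre_acts.append(z)
    activations.append(sigmoid(z))

# Backward pass: propagate a unit error signal and watch its norm shrink layer by layer.
delta = np.ones(width)
for W, z in zip(reversed(weights), reversed(pre_acts)):
    s = sigmoid(z)
    delta = (W.T @ delta) * s * (1.0 - s)  # chain rule: sigmoid'(z) <= 0.25
    print(f"gradient norm: {np.linalg.norm(delta):.3e}")
```

Running this prints gradient norms that drop by several orders of magnitude between the output and the first layers, which is the vanishing gradient problem in its simplest form.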