📕 Node [[exploding_gradient_problem]]
📄 Exploding_Gradient_Problem.md by @KGBicheno

exploding gradient problem

Go back to the [[AI Glossary]]

#seq

The tendency for gradients in deep neural networks (especially recurrent neural networks) to become surprisingly steep (high). Steep gradients result in very large updates to the weights of each node in a deep neural network.
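As a rough sketch (not part of the original entry), the NumPy toy below shows why recurrent networks are especially prone to this: backpropagating through many time steps repeatedly multiplies the gradient by the recurrent weight matrix, so its norm can grow exponentially when that matrix amplifies the signal. The matrix size, scale, and step count are illustrative choices.

```python
import numpy as np

# Toy illustration: backpropagating through a linear RNN multiplies the
# gradient by the recurrent weight matrix W at every time step, so its
# norm can grow exponentially when W tends to amplify the signal.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(8, 8))   # spectral radius typically a bit above 1 here
grad = rng.normal(size=8)                # gradient arriving at the final time step

for _ in range(50):                      # 50 steps of backpropagation through time
    grad = W.T @ grad                    # chain rule through one time step

print(np.linalg.norm(grad))              # grows to a very large value
```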

Models suffering from the exploding gradient problem become difficult or impossible to train. Gradient clipping can mitigate this problem.
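A minimal sketch of gradient clipping by global norm, written in plain NumPy rather than any particular framework; the function name `clip_by_global_norm`, the epsilon, and the threshold value are illustrative assumptions:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm=1.0):
    """Rescale a list of gradient arrays so their combined L2 norm
    does not exceed max_norm; leave them unchanged otherwise."""
    total_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if total_norm > max_norm:
        scale = max_norm / (total_norm + 1e-6)  # small epsilon avoids division issues
        grads = [g * scale for g in grads]
    return grads

# Example: an "exploded" gradient is rescaled before the weight update.
grads = [np.array([300.0, -400.0])]          # norm = 500
clipped = clip_by_global_norm(grads, max_norm=5.0)
print(np.linalg.norm(clipped[0]))            # approximately 5.0
```

Because clipping rescales the whole gradient rather than zeroing it, the update keeps its direction but is bounded in size.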

Compare with the [[vanishing gradient problem]].
