πŸ“• Node [[mini batch_stochastic_gradient_descent_(sgd)]]
πŸ“„ Mini-Batch_Stochastic_Gradient_Descent_(Sgd).md by @KGBicheno

mini-batch stochastic gradient descent (SGD)

Go back to the [[AI Glossary]]

A gradient descent algorithm that computes each parameter update from a mini-batch, i.e. a small, randomly sampled subset of the training data, rather than from the full training set. In other words, mini-batch SGD estimates the gradient from that subset. Vanilla SGD uses a mini-batch of size 1.
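A minimal NumPy sketch (not part of the glossary definition) showing the idea for linear regression; the function name `minibatch_sgd` and all hyperparameters are illustrative. Setting `batch_size=1` recovers vanilla SGD, and setting it to the full dataset size recovers batch gradient descent.

```python
import numpy as np

def minibatch_sgd(X, y, batch_size=32, lr=0.05, epochs=20, seed=0):
    """Fit y ≈ X @ w + b by mini-batch SGD on mean squared error."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        order = rng.permutation(n)                  # reshuffle examples each epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]   # indices of one mini-batch
            Xb, yb = X[idx], y[idx]
            err = Xb @ w + b - yb                   # predictions minus targets
            # Gradient of the MSE loss, estimated on the mini-batch only
            grad_w = 2 * Xb.T @ err / len(idx)
            grad_b = 2 * err.mean()
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

# Toy usage: recover w ≈ [2, -3], b ≈ 0.5 from noisy synthetic data
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 2))
y = X @ np.array([2.0, -3.0]) + 0.5 + rng.normal(scale=0.1, size=1000)
w, b = minibatch_sgd(X, y)
```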
