Mean Absolute Error (MAE)

Go back to the [[AI Glossary]]

An error metric calculated by averaging the absolute errors. In the context of evaluating a model’s accuracy, MAE is the average absolute difference between the actual and predicted values across all examples. Specifically, for n examples, where y_i is an actual value and ŷ_i its prediction, MAE is defined as follows:

$$\text{MAE} = \frac{1}{n} \sum_{i=1}^{n} \lvert y_i - \hat{y}_i \rvert$$
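
A minimal sketch of the formula above, assuming NumPy arrays of equal length (the function name `mean_absolute_error` here is illustrative, not from this note):

```python
import numpy as np

def mean_absolute_error(y_true, y_pred):
    """Average absolute difference between actual and predicted values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(y_true - y_pred))

# Example: predictions off by 1, 2, and 3 give an MAE of 2.0.
print(mean_absolute_error([3.0, 5.0, 8.0], [4.0, 3.0, 11.0]))
```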

πŸ›οΈ Stoas for [[mean_absolute_error_(mae)]]
πŸ“– Open document (Hedgedoc) at https://doc.anagora.org/mean_absolute_error_(mae)
πŸ“– Open document (Etherpad) at https://stoa.anagora.org/p/mean_absolute_error_(mae)