Learning Rate

Level 4

Short Description

A hyperparameter that controls how much model weights change in response to error during training.

Friendly Description: The learning rate is how big a step the AI takes each time it tries to learn from a mistake. Take steps that are too tiny, and learning crawls along forever. Take steps that are too big, and the model overshoots and never settles. Picking just the right learning rate is one of the most important little decisions in training a model.

Example: Imagine you're trying to find the lowest point in a valley while blindfolded. A small learning rate is shuffling forward an inch at a time, safe but slow. A large learning rate is taking giant leaps, fast but risky. Most training uses a moderate rate, sometimes shrinking it as the model gets closer to the answer.
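The valley analogy can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the function names `descend`, `lr`, and the toy objective f(x) = x² are assumptions, not from the entry): gradient descent on a simple bowl-shaped function, tried with a tiny, a moderate, and an oversized learning rate.

```python
def descend(lr, steps=50, x=10.0):
    """Minimize f(x) = x**2 with plain gradient descent.

    The learning rate `lr` scales how far each step moves.
    """
    for _ in range(steps):
        grad = 2 * x       # derivative of x**2 at the current point
        x = x - lr * grad  # step downhill, scaled by the learning rate
    return x

tiny = descend(lr=0.001)  # crawls: after 50 steps, still far from 0
good = descend(lr=0.1)    # settles close to the minimum at x = 0
huge = descend(lr=1.1)    # overshoots back and forth, getting worse
```

Running the sketch shows the three regimes from the analogy: the tiny rate barely moves from the starting point, the moderate rate lands near the bottom of the valley, and the large rate bounces away from it entirely. Shrinking the rate during training, as the example mentions, is commonly called a learning rate schedule.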