When a model fits its training data too closely, including the noise in it, and consequently performs poorly on new, unseen data.
Friendly Description: Overfitting is when a model memorizes its training data instead of really learning from it. Imagine a student who memorizes the answers to last year's test word for word. They might ace that test, but the moment the questions change, they're stuck. An overfit AI works the same way: great on the practice problems, surprisingly bad on real ones.
Example: If you train a model on photos of dogs that were all taken in sunny parks, it might overfit and start thinking that grass and sunshine are part of being a dog. Show it a photo of a dog indoors at night and it might get confused. Good training practices, like using varied data and keeping the model from becoming overly complex, keep it focused on what really matters.
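The idea above can be sketched in a few lines of code: fit two polynomial models to the same noisy samples of a simple trend and compare how they do on training points versus fresh test points. The data, polynomial degrees, and noise level here are illustrative choices, not anything from the entry itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# A simple underlying trend (a sine wave) sampled with some noise.
x_train = np.linspace(0, 1, 15)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)

# Fresh, noise-free test points the model never saw during fitting.
x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)

def fit_and_score(degree):
    # Fit a polynomial of the given degree, then measure mean squared
    # error on the training points and on the unseen test points.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

simple_train, simple_test = fit_and_score(3)    # modest model
complex_train, complex_test = fit_and_score(12)  # very flexible model

# The flexible model chases the training noise: its training error is
# lower, but its error on unseen data is worse.
print(f"degree 3:  train={simple_train:.4f}  test={simple_test:.4f}")
print(f"degree 12: train={complex_train:.4f}  test={complex_test:.4f}")
```

The degree-12 polynomial is like the student who memorized last year's answers: nearly perfect on the points it was trained on, worse the moment the questions change.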