
Overfitting

When a model learns its training data too closely, including its noise, and loses the ability to generalize to new inputs. Like a student who memorizes answers to practice tests but can't solve new problems. The model performs great on training data but poorly on anything it hasn't seen before.

Why it matters

Overfitting is the most common failure mode in model training. It's why evaluation uses a held-out test set, and why training for too many epochs can make a model worse on new data even as its training loss keeps improving.
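The train/test gap is easy to see with a toy experiment. Below is a minimal sketch (assuming NumPy and synthetic data invented for illustration) that fits polynomials of increasing degree to a few noisy points: a high-degree polynomial drives training error to near zero while test error stays large, which is overfitting in miniature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic task: learn y = sin(x) from a small noisy sample
x_train = np.sort(rng.uniform(0, 3, 10))
y_train = np.sin(x_train) + rng.normal(0, 0.1, 10)
# Held-out test set drawn from the same distribution
x_test = np.sort(rng.uniform(0, 3, 50))
y_test = np.sin(x_test) + rng.normal(0, 0.1, 50)

def fit_and_score(degree):
    # Fit a polynomial of the given degree to the training data,
    # then measure mean squared error on train and test sets.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for degree in (1, 3, 9):
    train_mse, test_mse = fit_and_score(degree)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
```

With 10 training points, a degree-9 polynomial passes through every point exactly (near-zero training error) yet fits the noise rather than the underlying curve, so its test error is worse. Measuring only training error would hide this, which is exactly why a separate test set matters.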
