The initial phase of training a foundation model on a large, broad dataset before any task-specific fine-tuning.
Friendly Description: Pre-training is the first big phase of teaching an AI, where it studies enormous amounts of general data to build up broad knowledge. It's like the years of general schooling a person goes through before specializing. After pre-training, the model knows a little about a lot, and is ready to be fine-tuned for specific jobs.
Example: Before being turned into a coding assistant, a language model goes through pre-training on a huge mix of books, websites, and articles. That's where it picks up grammar, facts, reasoning patterns, and even some programming basics: the foundational knowledge it builds on later.
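To make the idea concrete, here is a deliberately tiny sketch (a hypothetical illustration, not how real foundation models are built): instead of training a neural network on billions of tokens, we "pre-train" a toy bigram model by counting which word follows which in a small, mixed corpus. The corpus, function names, and data are all invented for this example; the point is only that the model absorbs general statistics from broad data, which it can later reuse.

```python
from collections import Counter, defaultdict

# Toy stand-in for pre-training: learn next-word statistics from a small,
# broad corpus that mixes everyday prose with a snippet of code, loosely
# mirroring a web-scale data mix. (Real pre-training optimizes a neural
# network with gradient descent; bigram counting is just the simplest analogy.)
general_corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "def add ( a , b ) : return a + b"
).split()

# "Pre-training" phase: absorb broad statistics from the whole corpus.
following = defaultdict(Counter)
for prev, nxt in zip(general_corpus, general_corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word observed during 'pre-training'."""
    if word not in following:
        return None  # never seen this word in the training data
    return following[word].most_common(1)[0][0]

# The same single model has picked up both prose patterns and code patterns,
# because both were present in its general training data.
print(predict_next("the"))     # a continuation learned from the prose
print(predict_next("return"))  # a continuation learned from the code snippet
```

Just as in the glossary entry, nothing here is task-specific yet: the model has only soaked up general patterns, and a later fine-tuning step would specialize it.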