Distillation

Level 4

Short Description: A technique where a smaller "student" model is trained to mimic the outputs of a larger "teacher" model, producing a faster, cheaper version with similar quality.

Friendly Description: Distillation in AI is like having a master chef teach a young apprentice their best recipes. The big, expensive AI model (the teacher) shares what it knows with a smaller, faster model (the student). The student learns to mimic the teacher's answers and ends up much cheaper to run, while still being surprisingly capable.

Example: A company might have a giant AI model that gives wonderful answers but takes several seconds and costs a lot of money per question. Through distillation, they train a smaller version that runs on a phone, gives nearly-as-good answers, and responds almost instantly, making it perfect for everyday users. A minimal code sketch of the idea follows.
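To make the mechanics concrete, here is a minimal sketch of the classic soft-label distillation loss in PyTorch. The specific values (a temperature of 4.0, a 50/50 loss weighting) and names like `student_logits` are illustrative assumptions, not a prescribed recipe; the core idea is that the student is trained to match the teacher's softened output distribution while still learning from the true labels.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft loss (mimic the teacher) with a hard loss (true labels)."""
    # Soften both output distributions with a temperature, then push the
    # student's distribution toward the teacher's via KL divergence.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)  # rescale so gradients stay comparable across temperatures
    # Ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Illustrative training step (teacher frozen, only the student is updated):
# teacher.eval()
# with torch.no_grad():
#     teacher_logits = teacher(batch)
# loss = distillation_loss(student(batch), teacher_logits, labels)
# loss.backward()
```

The temperature flattens both probability distributions so the student learns from the teacher's relative preferences among wrong answers, not just its top pick; higher temperatures transfer more of that signal.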