A configuration value (such as learning rate or batch size) set before training begins, as opposed to the model's weights, which are learned during training.
Friendly Description: Hyperparameters are the settings a developer chooses before training an AI, kind of like adjusting the temperature, time, and rack position before baking a cake. The recipe (the model) is the same, but those settings have a big effect on how the final result turns out. Picking the right hyperparameters is part science and part craft.
Example: When training a model, a developer might set a learning rate of 0.001, a batch size of 64, and 20 epochs. Tweaking any of those numbers can change how quickly and how well the model learns, even though the model architecture itself stays the same.
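As a minimal sketch of how those settings show up in practice (assuming PyTorch; the toy dataset and two-layer model here are purely illustrative), the hyperparameters sit at the top as plain variables while the weights are updated inside the loop:

```python
import torch
import torch.nn as nn

# Hyperparameters: chosen by the developer before training begins.
learning_rate = 0.001  # step size for each weight update
batch_size = 64        # examples processed per gradient step
num_epochs = 20        # full passes over the training data

# Toy data and model; the architecture stays fixed while hyperparameters vary.
X = torch.randn(1024, 10)
y = torch.randn(1024, 1)
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(X, y), batch_size=batch_size, shuffle=True
)

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
loss_fn = nn.MSELoss()

for epoch in range(num_epochs):
    for batch_X, batch_y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(batch_X), batch_y)
        loss.backward()   # gradients flow to the weights...
        optimizer.step()  # ...which update; the hyperparameters never change
```

Changing `learning_rate`, `batch_size`, or `num_epochs` and rerunning is exactly the kind of tweak the example describes: the model definition is untouched, but the training outcome can differ substantially.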