Regularization is an umbrella term for any technique that helps prevent a neural network from overfitting the training data. This post, available as a PDF below, follows on from my Introduction to Neural Networks; it explains what overfitting is and why neural networks are regularized, then gives a brief overview of the main techniques available to a network designer. Weight regularization, early stopping, and dropout are discussed in more detail.
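
As a quick preview of the three techniques named above, here is a minimal PyTorch sketch showing where each one typically appears in a training setup. This is not code from the post itself: the network shape, hyperparameters, and synthetic data are all illustrative assumptions.

```python
import torch
import torch.nn as nn

# A small fully connected network with dropout between layers.
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # dropout: randomly zeroes activations during training
    nn.Linear(64, 1),
)

# Weight regularization (an L2 penalty) via the optimizer's weight_decay.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)
loss_fn = nn.MSELoss()

# Synthetic data standing in for a real train/validation split.
x_train, y_train = torch.randn(256, 20), torch.randn(256, 1)
x_val, y_val = torch.randn(64, 20), torch.randn(64, 1)

# Early stopping: halt when validation loss stops improving.
best_val, patience, bad_epochs = float("inf"), 5, 0
for epoch in range(200):
    model.train()
    optimizer.zero_grad()
    loss = loss_fn(model(x_train), y_train)
    loss.backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(x_val), y_val).item()
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            print(f"early stop at epoch {epoch}, best val loss {best_val:.4f}")
            break
```

Note that each technique hooks into a different part of training: dropout is a layer inside the model, weight decay is an optimizer setting, and early stopping is logic wrapped around the training loop. The sections below discuss each in turn.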