This document summarizes regularization techniques for neural networks that help prevent overfitting. It describes how the number of hidden units controls model complexity and can lead to underfitting or overfitting, and it introduces early stopping as an alternative to explicit regularization, in which training is halted when the error on a validation set starts to increase. It also touches on the consistency of regularization terms under linear transformations of the inputs and outputs.
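As a concrete illustration of these ideas, below is a minimal sketch (assuming PyTorch, which the original does not mention) that combines an L2 weight penalty via the optimizer's `weight_decay` argument with patience-based early stopping on a validation set. The data, model size, and hyperparameters are all illustrative, not taken from the source.

```python
# Sketch: weight-decay regularization plus early stopping (illustrative only).
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic regression data: a noisy sine, split into train/validation halves.
x = torch.linspace(-3, 3, 200).unsqueeze(1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)
x_train, y_train = x[::2], y[::2]
x_val, y_val = x[1::2], y[1::2]

# The number of hidden units (20 here) controls model complexity.
model = nn.Sequential(nn.Linear(1, 20), nn.Tanh(), nn.Linear(20, 1))

# weight_decay adds a quadratic (L2) penalty on the parameters.
optimizer = torch.optim.SGD(model.parameters(), lr=0.05, weight_decay=1e-4)
loss_fn = nn.MSELoss()

best_val, best_state = float("inf"), None
patience, bad_epochs = 20, 0
for epoch in range(2000):
    optimizer.zero_grad()
    loss_fn(model(x_train), y_train).backward()
    optimizer.step()

    with torch.no_grad():
        val_loss = loss_fn(model(x_val), y_val).item()
    if val_loss < best_val:
        best_val = val_loss
        best_state = copy.deepcopy(model.state_dict())
        bad_epochs = 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:  # validation error stopped improving
            break  # early stopping

model.load_state_dict(best_state)  # restore the weights from the best epoch
```

Keeping a copy of the best-so-far weights and restoring them at the end is what makes early stopping act like regularization: the returned network is the one with the lowest validation error, not the one with the lowest training error.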
![PRML5 - Neural Networks](https://cdn-ak-scissors.b.st-hatena.com/image/square/08ddba33a34d0b5fcc6dda2e0c3212c45c14ea8c/height=288;version=1;width=512/https%3A%2F%2Fcdn.slidesharecdn.com%2Fss_thumbnails%2Fprml5-1-100706220740-phpapp01-thumbnail.jpg%3Fwidth%3D640%26height%3D640%26fit%3Dbounds)