
Dropout

A neural network can be thought of as a search problem: each node in the network is searching for a correlation between the input data and the correct output.

Dropout randomly turns nodes off during forward propagation, which helps keep the weights from converging to identical values. Once the forward pass is done, all the nodes are turned back on and back-propagation proceeds. Similarly, we can perform dropout on a layer by setting some of that layer's values to zero at random during forward propagation.
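
The following is a minimal NumPy sketch of this idea, using the common "inverted dropout" variant that rescales the surviving values so their expected magnitude is unchanged; the function name `dropout_forward` and the `drop_prob` parameter are illustrative, not taken from a particular library.

```python
import numpy as np

def dropout_forward(activations, drop_prob=0.5):
    """Randomly zero out a fraction of a layer's activations (inverted dropout)."""
    keep_prob = 1.0 - drop_prob
    # Binary mask: 1 keeps a node on, 0 turns it off for this forward pass
    mask = (np.random.rand(*activations.shape) < keep_prob).astype(activations.dtype)
    # Scale the surviving activations so the expected output magnitude stays the same
    return activations * mask / keep_prob

# Example: apply dropout to a batch of hidden-layer outputs
hidden = np.random.randn(4, 8)          # 4 samples, 8 hidden units
dropped = dropout_forward(hidden, 0.5)  # roughly half of the values become zero
```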

Use dropout only during training. Do not use it at runtime or on your testing dataset.
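
For instance, assuming a Keras model (one possible framework for this, not the only choice), the Dropout layer is applied only while fitting the model and is skipped automatically at prediction time:

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Dense(128, activation='relu', input_shape=(784,)),
    layers.Dropout(0.5),                 # nodes are turned off only during training
    layers.Dense(10, activation='softmax'),
])
# model.predict(...) runs with dropout disabled, so no nodes are dropped at test time
```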