
Dropout


Dropout is a regularization technique that helps prevent overfitting in neural networks. During training, it randomly sets a fraction of a layer's units to zero at each update, which stops the network from relying too heavily on any single unit and forces it to learn more robust features.
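As a rough illustration, here is a minimal NumPy sketch of inverted dropout, where surviving activations are rescaled by 1 / (1 - p) so the expected output stays the same and nothing changes at inference time. The function name and the drop rate are arbitrary choices for this example.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training."""
    if not training or p == 0.0:
        return x                          # no-op at inference
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p       # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)           # rescale so E[output] matches E[input]

# Example: a batch of 4 activations with 5 features each
x = np.ones((4, 5))
print(dropout(x, p=0.5))                  # roughly half the entries become 0, the rest 2.0
print(dropout(x, p=0.5, training=False))  # unchanged at inference
```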

However, dropout needs to be applied with care in recurrent neural networks (RNNs): because the hidden state is carried across time steps, naively sampling a different dropout mask at every step can repeatedly disrupt the information the network is trying to remember across the sequence.
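As a hedged sketch of this caution, the snippet below contrasts resampling a fresh mask at every time step with reusing one mask for the whole sequence (the variational dropout approach of Gal and Ghahramani); the array shapes and names are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                      # drop probability
T, H = 10, 8                 # time steps, hidden size
h = np.ones((T, H))          # stand-in for a sequence of hidden activations

# Naive dropout: a new mask at every step, so each unit is eventually dropped
# somewhere along the sequence and its signal is repeatedly interrupted.
naive = np.stack([h[t] * (rng.random(H) >= p) / (1 - p) for t in range(T)])

# Variational dropout: one mask sampled per sequence and reused at every step,
# so the same subset of units carries information through time.
mask = (rng.random(H) >= p) / (1 - p)
variational = h * mask
```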