Dropout

Dropout prevents neural network overfitting by randomly deactivating neurons during training to ensure robust feature learning.

Introduced by Geoffrey Hinton and colleagues at the University of Toronto in 2012, Dropout addresses the co-adaptation problem in deep neural networks. By randomly dropping units (along with their connections), typically with probability 0.5 during training, the technique forces the network to learn redundant representations rather than relying on a few specific high-weight paths. This stochastic regularization approximates training an ensemble of 2^n thinned networks, where n is the number of droppable units, and significantly lowers generalization error on benchmarks such as MNIST and ImageNet. It remains a standard component in frameworks like PyTorch and TensorFlow for regularizing large-scale models.
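The masking-and-scaling mechanic described above can be sketched in a few lines of NumPy. This is a minimal illustration of "inverted" dropout (the variant used by modern frameworks), not code from the paper; the function name and signature are assumptions for the example:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout sketch (illustrative, not framework code).

    During training, each unit is zeroed with probability p and the
    survivors are scaled by 1/(1-p) so the expected activation is
    unchanged; at inference time the layer is the identity.
    """
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)
```

Scaling at training time (rather than scaling weights by 1-p at test time, as the original paper describes) is the common framework convention, since it leaves the inference path untouched.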

https://arxiv.org/abs/1207.0580