
OALib Journal
ISSN: 2333-9721


Dither is Better than Dropout for Regularising Deep Neural Networks


Abstract:

Regularisation of deep neural networks (DNN) during training is critical to performance. By far the most popular method is known as dropout. Here, cast through the prism of signal processing theory, we compare and contrast the regularisation effects of dropout with those of dither. We illustrate some serious inherent limitations of dropout and demonstrate that dither provides a more effective regulariser.
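To make the two regularisers concrete, the following is a minimal NumPy sketch of the contrast the abstract draws. It assumes the standard formulation of (inverted) dropout, and it assumes "dither" here means adding low-amplitude random noise to unit activations during training, in the signal-processing sense; the function names, noise scale, and dropout rate are illustrative choices, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5):
    """Inverted dropout: zero each unit with probability p,
    and rescale the survivors by 1/(1-p) so the expected
    activation is unchanged."""
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def dither(x, scale=0.1):
    """Dither (as assumed here): perturb every unit with
    low-amplitude uniform noise instead of zeroing units."""
    return x + rng.uniform(-scale, scale, size=x.shape)

# Dropout produces a sparse, rescaled signal; dither produces
# a dense signal with small perturbations on every unit.
x = np.ones(8)
print(dropout(x))
print(dither(x))
```

The qualitative difference is visible at a glance: dropout discards roughly half the units each pass (a hard, multiplicative disturbance), while dither leaves every unit active and perturbs each one only slightly (a soft, additive disturbance).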
