Statistics, 2015

A Theoretically Grounded Application of Dropout in Recurrent Neural Networks


Abstract:

A long strand of empirical research has claimed that dropout cannot be applied between the recurrent connections of a recurrent neural network (RNN). The reasoning has been that the noise hinders the network's ability to model sequences, and that dropout should instead be applied to the RNN's inputs and outputs alone. But dropout is a vital tool for regularisation, and without dropout in recurrent layers our models overfit quickly. In this paper we show that a recently developed theoretical framework, casting dropout as approximate Bayesian inference, can give us mathematically grounded tools to apply dropout within the recurrent layers. We apply our new dropout technique in long short-term memory (LSTM) networks and show that the new approach significantly outperforms existing techniques.
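The core of the technique is that the dropout noise is tied across time steps: one mask is sampled per sequence and reused at every step, including on the recurrent connections, rather than resampling fresh noise at each step. A minimal PyTorch sketch of that idea follows; the class name, the 0.25 dropout rate, and the use of nn.LSTMCell are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class VariationalDropoutLSTM(nn.Module):
        """Single-layer LSTM with dropout masks sampled once per sequence
        and reused at every time step, on both the input and the recurrent
        (hidden-to-hidden) connections. A sketch, not the paper's code."""

        def __init__(self, input_size, hidden_size, dropout=0.25):
            super().__init__()
            self.cell = nn.LSTMCell(input_size, hidden_size)
            self.hidden_size = hidden_size
            self.dropout = dropout

        def forward(self, x):
            # x has shape (seq_len, batch, input_size)
            seq_len, batch, input_size = x.shape
            h = x.new_zeros(batch, self.hidden_size)
            c = x.new_zeros(batch, self.hidden_size)
            if self.training and self.dropout > 0:
                keep = 1.0 - self.dropout
                # Tied masks: sampled once, applied identically at every
                # step; dividing by `keep` is inverted-dropout scaling.
                mask_x = x.new_empty(batch, input_size).bernoulli_(keep) / keep
                mask_h = x.new_empty(batch, self.hidden_size).bernoulli_(keep) / keep
            else:
                mask_x = mask_h = 1.0
            outputs = []
            for t in range(seq_len):
                # Same masks on the input and on the recurrent state at
                # each step, so the noise is constant along the sequence.
                h, c = self.cell(x[t] * mask_x, (h * mask_h, c))
                outputs.append(h)
            return torch.stack(outputs)  # (seq_len, batch, hidden_size)

For example, VariationalDropoutLSTM(10, 20) applied to a tensor of shape (5, 3, 10) returns one of shape (5, 3, 20). The key design choice is that mask_x and mask_h are sampled once per forward pass rather than once per time step; per-step resampling is the kind of recurrent noise that, per the abstract, earlier work found to hinder sequence modelling.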
