Distilling Word Embeddings: An Encoding Approach


Abstract:

Distilling knowledge from a well-trained cumbersome network into a small one has recently become a new research topic, as lightweight neural networks with high performance are in particular demand in resource-restricted systems. This paper addresses the problem of distilling word embeddings for NLP tasks. We propose an encoding approach that distills task-specific knowledge from high-dimensional embeddings, retaining high performance while substantially reducing model complexity. Experimental results show that our method outperforms directly training neural networks with small embeddings.
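The encoding idea can be pictured as a small trainable layer that compresses large, well-trained embeddings into a low-dimensional space before they feed a lightweight model. Below is a minimal sketch, assuming PyTorch; the EncodedEmbedding class, the tanh activation, and all dimensions are illustrative assumptions, not the paper's exact architecture or training procedure.

# Minimal sketch of distilling embeddings via a learned encoding layer.
import torch
import torch.nn as nn

class EncodedEmbedding(nn.Module):
    """Compress large pretrained embeddings into a small space
    with a trainable encoding layer (hypothetical design)."""
    def __init__(self, pretrained: torch.Tensor, small_dim: int):
        super().__init__()
        vocab_size, big_dim = pretrained.shape
        # Frozen cumbersome embeddings, e.g. 300-d word vectors.
        self.big = nn.Embedding.from_pretrained(pretrained, freeze=True)
        # Trainable encoder mapping big_dim -> small_dim.
        self.encode = nn.Linear(big_dim, small_dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # Look up large vectors, then project them down.
        return torch.tanh(self.encode(self.big(token_ids)))

# Example: compress 300-d vectors for a 10k vocabulary down to 50-d.
pretrained = torch.randn(10_000, 300)   # stand-in for real trained vectors
distilled = EncodedEmbedding(pretrained, small_dim=50)
ids = torch.tensor([[1, 42, 7]])        # a toy token sequence
small_vectors = distilled(ids)          # shape: (1, 3, 50)
print(small_vectors.shape)

In a full setup, one plausible training scheme is to optimize the encoding layer jointly with the small downstream model on the task's supervision, so that the compressed vectors retain only the task-relevant directions of the large embeddings, consistent with the "task-specific knowledge" the abstract describes.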
