Tecnura  2012 

Efficient selection of neural architectures using destructive and regularization techniques

Keywords: back-propagation algorithm, neural networks, regularization, pruning techniques.


Abstract:

This article presents a detailed theoretical and practical comparison of ontogenetic neural networks obtained through pruning and regularization algorithms. We first introduce the concept of a regularized error function and the different ways of modifying such a function: weight decay (WD), soft weight sharing, and the Chauvin penalty. We then consider some of the most representative pruning algorithms, particularly OBD (Optimal Brain Damage). We apply OBD and WD to the XOR-function problem in order to compare pruning techniques against regularization algorithms: WD relies on the basic back-propagation algorithm, while OBD uses the inverse Hessian matrix. According to the results, WD is faster than OBD but deletes fewer weights; OBD reduces the complexity of the neural-network architecture more aggressively, but its computational cost remains high.
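The two techniques compared in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: the weights, gradient, and Hessian diagonal are random stand-ins, and the OBD saliency shown is the standard diagonal-Hessian form s_i = (1/2) H_ii w_i^2 from the original Optimal Brain Damage formulation.

```python
import numpy as np

# Hypothetical weights of one layer (illustrative only, not from the paper).
rng = np.random.default_rng(0)
w = rng.normal(size=5)

# Weight decay (WD): the regularized error is E_reg = E + (lam/2) * sum(w**2),
# so each gradient step gains an extra lam * w term that shrinks every weight.
lam, lr = 0.01, 0.1
grad_E = np.zeros_like(w)            # stand-in for the back-propagated gradient of E
w_new = w - lr * (grad_E + lam * w)  # with grad_E = 0, this is pure shrinkage

# OBD (Optimal Brain Damage): the saliency of weight i is s_i = (1/2) * H_ii * w_i**2,
# where H_ii is the i-th diagonal entry of the Hessian of E. Weights with the
# smallest saliency are the cheapest to delete and are pruned first.
H_diag = np.abs(rng.normal(size=5))  # stand-in for the Hessian diagonal
saliency = 0.5 * H_diag * w**2
prune_order = np.argsort(saliency)   # indices of weights to delete first
```

This also makes the abstract's cost comparison concrete: the WD update adds only one extra term to back-propagation, whereas OBD needs second-order (Hessian) information for every weight, which is where its higher computational cost comes from.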

