Artificial neural networks for diagnosis and survival prediction in colon cancer

DOI: 10.1186/1476-4598-4-29

Keywords: ANN, backpropagation, nodes, perceptron, performance, training, weights


Abstract:

Artificial neural networks (ANNs) are regression devices containing layers of computing nodes (crudely analogous to mammalian biological neurons) with remarkable information-processing characteristics. They are able to detect nonlinearities that are not explicitly formulated as inputs, making them capable of learning and adaptation. They possess high parallelism, robustness, generalization and noise tolerance, which make them capable of clustering, function approximation, forecasting and association, and of performing massively parallel multifactorial analyses for modeling complex patterns where there is little a priori knowledge [1]. Artificial neural models possessing such characteristics are desirable because: (a) nonlinearity allows a better fit to the data; (b) noise insensitivity leads to accurate prediction in the presence of uncertain data and measurement errors; (c) high parallelism implies fast processing and tolerance of hardware failure; (d) learning and adaptability permit the system to update and/or modify its internal structure in response to a changing environment; and (e) generalization enables application of the model to unlearned data [2].

In the early 1940s, McCulloch and Pitts [3] explored the computational abilities of networks built from theoretical mathematical models of simple artificial neurons. When these early neurons were combined, it was possible to construct networks capable of computing any of the finite basic Boolean logical functions, including symbolic logic. The system comprising an artificial neuron and its inputs (stimuli) was referred to as "the Perceptron", which established a mapping between input activity and an output signal. The next important milestone was the development of the first trainable perceptron network by Rosenblatt, 1959 [4] and Widrow & Hoff, 1960 [5], initially as a linear model having two layers of neurons or nodes (an input and an output layer) and a single layer of interconnections with adjustable weights.
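The trainable perceptron described above can be illustrated in a few lines of code. The sketch below is not taken from the paper; it assumes a NumPy environment and uses the classic perceptron learning rule to fit the Boolean AND function, one of the simple logical functions such early networks could compute, showing how a single layer of adjustable weights maps input activity to an output signal. Function and variable names, the learning rate and the epoch count are arbitrary choices for illustration.

```python
# Minimal perceptron sketch (illustrative, not the authors' implementation):
# an input layer connected to one output node by a single layer of
# adjustable weights, trained with the Rosenblatt perceptron rule.
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Learn weights w and bias b so that step(x @ w + b) matches y."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            output = 1 if xi @ w + b > 0 else 0   # threshold (step) activation
            error = target - output
            w += lr * error * xi                  # adjust interconnection weights
            b += lr * error
    return w, b

# Boolean AND truth table as training data.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print([1 if xi @ w + b > 0 else 0 for xi in X])   # expected: [0, 0, 0, 1]
```

Because AND is linearly separable, the weight updates converge to a separating line; later multilayer networks trained with backpropagation (see the keywords above) extend this scheme to nonlinear mappings.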
