Models For Parallel Processing Artificial Neural Networks

Artificial neural networks (ANNs) are layered, information-processing systems whose designs have distant origins in real neural structures such as the vertebrate retina. The architecture of ANNs closely resembles the feature extractor systems of Fukushima (1969; 1970) (see Section 7.1.3). That is, there are two or more planes containing signal nodes. Connecting the nodes between the planes are signal paths with numerical weights. In Fukushima's systems, the weights are fixed; in ANNs, the weights can be adjusted by a training rule to optimize the performance of the ANN. The inspiration for rule-based optimization of weights can be traced to a seminal text by Hebb (1949), The Organization of Behavior: A Neuropsychological Theory. Hebb hypothesized that when neuron A persistently and frequently causes neuron B to fire, the excitatory synaptic connections between A and B are strengthened. Mathematically, Hebb's hypothesis can be approximated by the discrete equation:

w_{j,k+1} = w_{jk} + c y_k x_{jk}
where x_{jk} is the jth input at t = kT to the node whose output at t = kT is y_k, w_{jk} is the weight of the path connecting the jth input to the output node at t = kT, w_{j,k+1} is the new weight, and c is a positive learning rate constant. A more general learning rule is given by Khanna (1990).

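The discrete Hebbian update described above can be sketched in a few lines of code. This is a minimal illustration, not a method from the text: the learning rate, input values, and initial weights below are arbitrary assumptions chosen for the example, and the node is taken to be a simple linear unit.

```python
import numpy as np

def hebb_update(w, x, y, c=0.1):
    """One discrete Hebbian step: w_{j,k+1} = w_{jk} + c * y_k * x_{jk}.

    w : weight vector w_{jk} at step k
    x : input vector x_{jk} at step k
    y : node output y_k at step k
    c : positive learning rate constant
    """
    return w + c * y * x

# Toy usage with a single linear node (illustrative values only).
x = np.array([1.0, 0.5, -0.2])   # inputs x_{jk}
w = np.array([0.1, 0.1, 0.1])    # weights w_{jk}
y = float(w @ x)                 # node output y_k
w_next = hebb_update(w, x, y, c=0.5)
```

Note that inputs whose sign agrees with the output have their weights strengthened, in line with Hebb's hypothesis, while anti-correlated inputs are weakened.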