artificial neural network

(ANN, commonly just "neural network" or "neural net") A network of many very simple processors ("units" or "neurons"), each possibly having a (small amount of) local memory. The units are connected by unidirectional communication channels ("connections"), which carry numeric (as opposed to symbolic) data. The units operate only on their local data and on the inputs they receive via the connections.
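
The following short Python sketch (an illustration added here, not part of the original entry; all names and values are assumptions) models a single such unit: its only state is the weights of its incoming connections plus a small local memory, and it computes its output purely from the numeric signals arriving on those connections.

    import math

    class Unit:
        """One very simple processor: local connection weights plus a small local memory."""
        def __init__(self, weights):
            self.weights = list(weights)   # one weight per incoming connection
            self.last_output = 0.0         # the unit's small amount of local memory

        def activate(self, inputs):
            # Operate only on local data and the numeric inputs received via the connections.
            total = sum(w * x for w, x in zip(self.weights, inputs))
            self.last_output = 1.0 / (1.0 + math.exp(-total))   # smooth non-linear response
            return self.last_output

    u = Unit([0.8, -0.3])
    print(u.activate([1.0, 2.0]))   # output depends only on the unit's weights and its inputs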

A neural network is a processing device, either an algorithm or actual hardware, whose design was inspired by the structure and functioning of animal brains and components thereof.

Most neural networks have some sort of "training" rule whereby the weights of connections are adjusted on the basis of presented patterns. In other words, neural networks "learn" from examples, just as children learn to recognise dogs from examples of dogs, and exhibit some capacity for generalisation beyond the training data.
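
As a concrete illustration of such a training rule (added here as a sketch; the tiny dataset, learning rate, and function names are assumptions, not part of the original entry), the classic perceptron update nudges the connection weights whenever the unit misclassifies a presented pattern:

    def step(x):
        # Simple threshold discriminator: fire (1) when the summed input exceeds zero.
        return 1 if x > 0 else 0

    def train_perceptron(patterns, epochs=20, lr=0.1):
        # Adjust connection weights on the basis of presented patterns.
        n_inputs = len(patterns[0][0])
        weights = [0.0] * n_inputs
        bias = 0.0
        for _ in range(epochs):
            for inputs, target in patterns:
                output = step(sum(w * x for w, x in zip(weights, inputs)) + bias)
                error = target - output   # zero when the pattern is already handled correctly
                weights = [w + lr * error * x for w, x in zip(weights, inputs)]
                bias += lr * error
        return weights, bias

    # "Learning from examples": presented patterns of the logical AND function.
    examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w, b = train_perceptron(examples)
    print(w, b)   # weights and bias that separate the AND patterns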

Neurons are often elementary non-linear signal processors (in the limit, simple threshold discriminators). Another feature that distinguishes NNs from other computing devices is their high degree of interconnection, which allows a high degree of parallelism. Further, there is no idle memory containing data and programs; rather, each neuron is pre-programmed and continuously active.
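
To make the interconnection and parallelism point concrete, the following sketch (again an illustration added here, with made-up sizes and values) evaluates a whole layer of simple non-linear units at once: each row of the weight matrix holds one unit's incoming connection weights, and every unit applies the same element-wise non-linearity.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 3))   # 4 units, each connected to the same 3 inputs (assumed sizes)
    b = np.zeros(4)

    def layer(x, W, b):
        # All units operate at once on the signals arriving via their connections.
        return np.tanh(W @ x + b)   # smooth non-linearity; a hard threshold is the limiting case

    x = np.array([0.5, -1.0, 2.0])
    print(layer(x, W, b))   # one output per unit, computed in parallel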

The term "neural net" should logically, but in common usage never does, also include biological neural networks, whose elementary structures are far more complicated than the mathematical models used for ANNs.

See Aspirin, Hopfield network, McCulloch-Pitts neuron.

Usenet newsgroup: news:comp.ai.neural-nets.

(1997-10-13)

