Artificial neural networks (ANNs) are a type of computer model that combines some of the strengths of biological neural networks with those of other modeling approaches, such as the perceptron and systems based on fuzzy logic.

An ANN is composed of a network of simple neurons connected by weighted edges. Data are presented to the network as input vectors, which are propagated through the weighted connections; from these the network generates corresponding output vectors.
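
As a rough sketch of this structure, the example below (in Python) propagates one input vector through a small two-layer network. The layer sizes, random weights, and sigmoid activation are illustrative assumptions rather than details drawn from the text.

    import numpy as np

    # Hypothetical two-layer feedforward network: layer sizes, weights, and
    # the sigmoid activation are illustrative choices, not specified here.
    rng = np.random.default_rng(0)

    W1 = rng.normal(size=(4, 3))   # weights from 3 inputs to 4 hidden neurons
    b1 = np.zeros(4)               # hidden-layer bias terms
    W2 = rng.normal(size=(2, 4))   # weights from 4 hidden neurons to 2 outputs
    b2 = np.zeros(2)               # output-layer bias terms

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(x):
        # Propagate one input vector through the weighted edges.
        hidden = sigmoid(W1 @ x + b1)    # weighted sum, then nonlinearity
        output = sigmoid(W2 @ hidden + b2)
        return output

    x = np.array([0.5, -1.0, 2.0])       # an example input vector
    print(forward(x))                    # the corresponding output vector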

The network is trained by presenting it with input vectors whose desired outputs are known and measuring the outputs it actually produces. A training algorithm then searches for a set of connection weights that minimizes the difference between the produced and desired outputs; the result is an estimate of the weights that best reproduce the observed output vectors.
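
A minimal training sketch in the same spirit, assuming squared error and plain gradient descent on a single sigmoid neuron (the data set, learning rate, and iteration count are arbitrary choices made only so the example runs):

    import numpy as np

    rng = np.random.default_rng(1)

    X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])  # inputs
    t = np.array([0.0, 1.0, 1.0, 1.0])                              # targets (logical OR)

    w = rng.normal(size=2)   # weights to be estimated
    b = 0.0                  # bias term
    lr = 0.5                 # learning rate

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for _ in range(2000):
        y = sigmoid(X @ w + b)              # present the inputs, measure the output
        err = y - t                         # deviation from the desired outputs
        grad_w = X.T @ (err * y * (1 - y))  # gradient of squared error w.r.t. weights
        grad_b = np.sum(err * y * (1 - y))
        w -= lr * grad_w                    # adjust the weights to reduce the error
        b -= lr * grad_b

    print(np.round(sigmoid(X @ w + b), 2))  # outputs now approach the targets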

One of their strengths is that they can learn complex, nonlinear mappings directly from examples within a single connected architecture. This is in contrast to approaches based on hand-crafted rules, which do not adapt from data, and to simple linear models, which cannot capture such nonlinear relationships.

An ANN is often used in applications in which input data must be transformed into output data of a different form (e.g. mapped to a different representation), or in which a learned, nonlinear transformation of the data is required. More generally, an ANN can be used to produce transformed data that serve as input to subsequent machine learning stages.
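
One way to read this, sketched below under assumed layer sizes and randomly initialized weights, is to treat a network's hidden-layer activations as a transformed feature matrix that a later learning stage consumes.

    import numpy as np

    # Sketch of using a network as a data transformer: the hidden-layer
    # activations of an (here untrained, randomly initialized) network become
    # a feature matrix for a later learning stage. Sizes are assumptions.
    rng = np.random.default_rng(2)

    W1 = rng.normal(size=(8, 3))             # 3 raw input features -> 8 hidden units
    b1 = np.zeros(8)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    X_raw = rng.normal(size=(100, 3))        # 100 example input vectors
    X_features = sigmoid(X_raw @ W1.T + b1)  # transformed representation

    # X_features (shape 100 x 8) can now be fed to any subsequent model,
    # e.g. a linear classifier or another network.
    print(X_features.shape)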

ANNs have applications in a number of fields, including machine learning, signal processing, speech processing, pattern recognition, image processing, and computer vision. See, for example, H. K. Simon, “Introduction to ANNs”, Neural Networks for Speech Recognition, pp. 1-12 (IEEE, 1993); W. G. Anderson, “The ANN: A Modern Paradigm in Pattern Recognition”, Machine Learning, Vol. 12, No. 1, pp. 1-29 (1993); and A. E. Raoul et al., “An Introduction to Neurons”, Neural Networks, Vol. 2.8, pp. 185-198 (1989).