Citation
B. Lenze, “Mathematics and Neural Networks - A Glance at some Basic Connections,” Acta Applicandae Mathematicae, vol. 55, pp. 303–311, 1999.
Abstract
In the following paper, we present a brief and easily accessible introduction to the theory of neural networks, with special emphasis on the rôle of pure and applied mathematics in this interesting field of research. In order to allow a quick and direct approach even for nonspecialists, we only consider three-layer feedforward networks with sigmoidal transfer functions and do not cover general multi-layer, recursive, or radial-basis-function networks. Moreover, we focus our attention on density and complexity results, while construction problems based on operator techniques are not discussed in detail. In particular, in connection with complexity results, we show that neural networks in general have the power to approximate certain function spaces with a minimal number of free parameters. In other words, from this specific point of view, neural networks represent one of the best possible approximation devices available. Besides pointing out this remarkable fact, the main motivation for presenting this paper is to give more mathematicians an idea of what is going on in the theory of neural networks and, perhaps, to encourage at least a few of them to start working in this highly interdisciplinary and promising field, too.
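The three-layer feedforward networks with sigmoidal transfer functions mentioned in the abstract can be written as N(x) = Σ_j c_j σ(a_j x + b_j). The following Python sketch illustrates this architecture approximating a smooth function; the target function, hidden width, random inner weights, and least-squares fit of the output weights are illustrative assumptions, not constructions from the paper itself.

```python
# Minimal sketch of a three-layer feedforward network with a sigmoidal
# transfer function: N(x) = sum_j c_j * sigma(a_j * x + b_j).
# All concrete choices below (target function, width, fitting method)
# are illustrative assumptions, not taken from the cited paper.
import numpy as np

def sigmoid(t):
    """Standard logistic sigmoidal transfer function."""
    return 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(0)

# Sample a smooth target function on [0, 2*pi] (illustrative choice).
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x)

# Hidden layer: n sigmoidal units with fixed random inner weights and biases.
n = 20
a = rng.normal(scale=2.0, size=n)                      # inner weights a_j
b = rng.uniform(-2.0 * np.pi, 2.0 * np.pi, size=n)     # biases b_j
H = sigmoid(np.outer(x, a) + b)                        # hidden activations, shape (200, n)

# Output weights c_j chosen by linear least squares on the sampled data.
c, *_ = np.linalg.lstsq(H, y, rcond=None)

approx = H @ c
print("max approximation error:", np.max(np.abs(approx - y)))
```

Even this simple setup, with only the output weights fitted, already produces a small uniform error on the sampled grid; the density and complexity results surveyed in the paper concern how well such networks can do in principle and how the number of free parameters scales with the desired accuracy.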