On local and global sigma-pi neural networks – a common link

Fast facts

  • Internal authorship

  • Publication year

    • 1994
  • Journal

    Advances in Computational Mathematics (4)

  • Organizational unit

  • Subjects

    • Applied mathematics
  • Publication format

    Journal article (Article)

Citation

B. Lenze, "On local and global sigma-pi neural networks – a common link," Advances in Computational Mathematics, vol. 2, no. 4, pp. 479-491, 1994.

Content

In general, there is a great difference between the usual three-layer feedforward neural networks with local basis functions in the hidden processing elements and those with standard sigmoidal transfer functions (in the following often called global basis functions). The reason for this difference in nature lies in the ridge-type arguments that are commonly used. It is the aim of this paper to show that the situation changes completely when so-called hyperbolic-type arguments are used instead of ridge-type arguments. In detail, we show that the usual sigmoidal transfer functions evaluated at hyperbolic-type arguments – usually called sigma-pi units – can be used to construct local basis functions which vanish at infinity and, moreover, are integrable and give rise to a partition of unity, both in Cauchy's principal value sense. At this point, standard strategies for approximation with local basis functions can be applied without giving up the concept of non-local sigmoidal transfer functions.
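
The abstract contrasts ridge-type and hyperbolic-type arguments only verbally. The following is an illustrative sketch of the two argument types; the notation and the exact parametrization are assumptions for exposition and are not taken from the paper itself. A ridge-type hidden unit evaluates a sigmoidal transfer function sigma on an affine argument,

\[
  y_k(x) \;=\; \sigma\bigl(\langle a_k, x\rangle + b_k\bigr), \qquad x \in \mathbb{R}^n,
\]

which is constant along the hyperplanes \(\langle a_k, x\rangle = \text{const}\) and therefore cannot decay in every direction: the unit acts as a global basis function. A hyperbolic-type sigma-pi unit instead feeds the sigmoid a product of shifted coordinates, for instance

\[
  z_k(x) \;=\; \sigma\!\Bigl(\lambda_k \prod_{i=1}^{n} \bigl(x_i - d_{k,i}\bigr)\Bigr),
\]

whose level sets are hyperbola-like surfaces centred at \(d_k\). According to the abstract, suitable combinations of such units vanish at infinity, are integrable, and give rise to a partition of unity, both in Cauchy's principal value sense, so that approximation strategies for local basis functions become available. Note that for \(n = 1\) the product argument is again affine, so the distinction between the two argument types only appears in dimension two and higher.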

