Improving Leung's bidirectional learning rule for associative memories

Fast facts

  • Internal authorship

  • Published

    • 2001
  • Journal

    IEEE Transactions on Neural Networks, vol. 12, no. 5

  • Organizational unit

  • Subjects

    • Applied computer science
  • Publication format

    Journal article (Article)

Citation

B. Lenze, "Improving Leung's bidirectional learning rule for associative memories," IEEE Transactions on Neural Networks, vol. 12, no. 5, pp. 1222-1226, 2001.

Content

Leung (1994) introduced a perceptron-like learning rule to enhance the recall performance of bidirectional associative memories (BAMs). He proved that his so-called bidirectional learning scheme always reaches a solution within a finite number of learning iterations, provided a solution exists. Unfortunately, in Leung's setting a solution exists only if the training set is strongly linearly separable by hyperplanes through the origin. We extend Leung's approach to conditionally strongly linearly separable sets, which allow separating hyperplanes that do not contain the origin. Furthermore, we generalize the BAMs themselves by introducing so-called dilation and translation parameters, which enlarge their capacity while leaving their complexity almost unaffected. The whole approach leads to a generalized bidirectional learning rule that generates BAMs with dilation and translation parameters which perform perfectly on the training set whenever the latter satisfies the conditionally strong linear separability assumption. Hence, in the sense of Leung, we obtain an optimal learning strategy that contains Leung's initial idea as a special case.
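To illustrate the flavor of the approach, the following is a minimal sketch of perceptron-style bidirectional training for a BAM with per-unit thresholds, which play the role of the translation parameters above (separating hyperplanes need not pass through the origin). All names, the learning rate, and the update scheme are illustrative assumptions, not the paper's exact rule:

```python
import numpy as np

def sign(u):
    # bipolar sign; maps 0 to +1 for definiteness (an assumption)
    return np.where(u >= 0, 1, -1)

def train_bam(X, Y, epochs=100, eta=1.0):
    """Hypothetical perceptron-style bidirectional training sketch.

    X: (p, n) bipolar input patterns, Y: (p, m) bipolar associates.
    Learns a weight matrix W (m, n) plus thresholds ty (forward) and
    tx (backward) -- the 'translation' role from the abstract.
    """
    p, n = X.shape
    _, m = Y.shape
    W = np.zeros((m, n))
    ty = np.zeros(m)   # forward-pass thresholds
    tx = np.zeros(n)   # backward-pass thresholds
    for _ in range(epochs):
        stable = True
        for x, y in zip(X, Y):
            # forward recall x -> y; perceptron-like correction on error
            y_hat = sign(W @ x - ty)
            err_y = (y - y_hat) / 2          # entries in {-1, 0, +1}
            if np.any(err_y):
                W += eta * np.outer(err_y, x)
                ty -= eta * err_y
                stable = False
            # backward recall y -> x via the transposed weights
            x_hat = sign(W.T @ y - tx)
            err_x = (x - x_hat) / 2
            if np.any(err_x):
                W += eta * np.outer(y, err_x)
                tx -= eta * err_x
                stable = False
        if stable:                            # all pairs recalled in both directions
            break
    return W, tx, ty
```

On a training set satisfying the separability assumption, such updates stop after finitely many corrections, mirroring the finite-convergence guarantee discussed above; without the thresholds, only sets separable by hyperplanes through the origin could be stored perfectly.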
