Information storage in sparsely coded memory nets

DOI: 10.1088/0954-898x_1_1_005
ABSTRACT
We study simple feedforward neural networks for pattern storage and retrieval, using information-theoretic criteria. Two Hebbian learning rules are considered, with emphasis on sparsely coded patterns. We address the question: under which conditions is the optimal information storage reached in the error-full regime?

For the model introduced some time ago by Willshaw, Buneman and Longuet-Higgins, the stored information goes through a maximum, which may lie within the error-less or the error-full regime according to the value of the coding rate. However, it eventually vanishes as learning goes on and more patterns are stored.

For the original Hebb learning rule, where reinforcement occurs whenever both input and output neurons are active, the stored information reaches a stationary value, 1/(π ln 2), when the net is overloaded beyond its threshold for errors. If the coding rate f′ of the output pattern is small enough, the information storage goes through a maximum, which saturates the Gardner bound, 1/(2 ln 2).
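The capacity figures quoted for the Willshaw model follow from a standard back-of-envelope argument; the sketch below is not reproduced from the paper itself, only consistent with the abstract, with the notation f, f′ taken from it and N, N′ assumed to denote input and output layer sizes.

```latex
% Sketch (not from the paper): standard capacity estimate for the
% Willshaw net; notation f, f' and the quoted bounds follow the abstract.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
After storing $p$ pattern pairs with input coding rate $f$ and output
coding rate $f'$, a given binary synapse has been potentiated with
probability
\[
  q = 1 - (1 - f f')^{p} \simeq 1 - e^{-p f f'} .
\]
Each output pattern carries roughly $N' f' \log_2(1/f')$ bits, so the
information stored per synapse is
\[
  I = \frac{p\,N' f' \log_2(1/f')}{N N'}
    = \frac{-\ln(1-q)\,\log_2(1/f')}{f N} .
\]
A spurious output unit fires with probability $q^{fN}$; keeping
$q^{fN} N' \lesssim 1$ and maximizing $-\ln(1-q)\,\ln(1/q)$ places the
optimum at $q = 1/2$, and for maximally sparse outputs ($f' \sim 1/N'$)
\[
  I_{\max} = \ln 2 \approx 0.69 \ \text{bits per synapse},
\]
about $96\%$ of the Gardner bound $1/(2\ln 2) \approx 0.72$ quoted in
the abstract.
\end{document}
```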
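To make the storage and retrieval procedure concrete, here is a minimal Python sketch of a Willshaw-type net with clipped Hebbian learning. The layer sizes, coding rates, and pattern count are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and coding rates (not from the paper).
N_in, N_out = 256, 256
f, f_out = 0.05, 0.05   # coding rates of input and output patterns
p = 200                 # number of stored pattern pairs

def sparse_patterns(n_units, rate, n_patterns):
    """Random binary patterns with a fixed number of active units."""
    k = max(1, int(rate * n_units))
    pats = np.zeros((n_patterns, n_units), dtype=np.uint8)
    for mu in range(n_patterns):
        pats[mu, rng.choice(n_units, size=k, replace=False)] = 1
    return pats

X = sparse_patterns(N_in, f, p)
Y = sparse_patterns(N_out, f_out, p)

# Willshaw rule: a binary synapse J_ij is switched on (and stays on)
# whenever output i and input j are simultaneously active in a stored pair.
# The abstract's second rule (plain Hebb) would instead accumulate:
# J = J + np.outer(Y[mu], X[mu]).
J = np.zeros((N_out, N_in), dtype=np.uint8)
for mu in range(p):
    J |= np.outer(Y[mu], X[mu])

def retrieve(x):
    """Output unit fires iff every active input reaches it through a
    potentiated synapse (threshold = number of active inputs)."""
    h = J @ x
    return (h >= x.sum()).astype(np.uint8)

# Errors in this model are one-sided: all true 1s are recovered, but
# spurious 1s appear once too many patterns have been stored.
mu = 0
y_hat = retrieve(X[mu])
spurious = int(((y_hat == 1) & (Y[mu] == 0)).sum())
print(f"spurious active outputs for pattern {mu}: {spurious}")
```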