THE SUPER-TURING COMPUTATIONAL POWER OF PLASTIC RECURRENT NEURAL NETWORKS
Turing
Computational model
DOI: 10.1142/s0129065714500294
Publication Date: 2014-09-16
AUTHORS (2)
ABSTRACT
We study the computational capabilities of a biologically inspired neural model where the synaptic weights, the connectivity pattern, and the number of neurons can evolve over time rather than stay static. Our study focuses on the mere concept of plasticity, so that the nature of the updates is assumed to be unconstrained. In this context, we show that the so-called plastic recurrent neural networks (RNNs) are capable of the precise super-Turing computational power, as static analog networks, irrespective of whether their weights are modeled by rational or real numbers, and moreover, irrespective of whether their plasticity patterns are restricted to bi-valued updates or expressed in any other more general form of updating. Consequently, the incorporation of plasticity alone in basic RNNs suffices to break the Turing barrier and achieve the super-Turing level of computation. The consideration of additional mechanisms of architectural plasticity does not further increase the computational power of the networks. These results support the claim that the mechanism of plasticity is crucially involved in the computational and dynamical capabilities of biological neural networks. They further show that the super-Turing level of computation reflects in a suitable way the capabilities of brain-like models of computation.
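To make the notion of plasticity described in the abstract concrete, the sketch below runs a toy recurrent network in Python whose weight matrix is recomputed at every time step from a bi-valued schedule, so each connection switches between two admissible values over time. This is a minimal illustration of the concept only, not the construction used in the paper; the function name plastic_rnn_run, the saturated-linear activation, and the random update schedule are assumptions made for this example.

import numpy as np

def plastic_rnn_run(x_seq, n_neurons=8, seed=0):
    """Illustrative sketch (not the paper's construction): a recurrent
    network whose weight matrix changes at every time step under a
    bi-valued plasticity rule, i.e. each weight may switch between
    two admissible values over time."""
    rng = np.random.default_rng(seed)
    # Two admissible values per connection (bi-valued plasticity).
    w_low = rng.uniform(-0.5, 0.0, size=(n_neurons, n_neurons))
    w_high = rng.uniform(0.0, 0.5, size=(n_neurons, n_neurons))
    w_in = rng.uniform(-0.5, 0.5, size=n_neurons)
    state = np.zeros(n_neurons)

    def sigma(z):
        # Saturated-linear activation, commonly used in analog RNN models.
        return np.clip(z, 0.0, 1.0)

    outputs = []
    for x in x_seq:
        # Plasticity schedule: at each step, every weight takes one of its
        # two admissible values (here chosen by an arbitrary random mask).
        mask = rng.integers(0, 2, size=(n_neurons, n_neurons)).astype(bool)
        W_t = np.where(mask, w_high, w_low)
        state = sigma(W_t @ state + w_in * x)
        outputs.append(state[0])  # read out the first neuron
    return np.array(outputs)

print(plastic_rnn_run([1.0, 0.0, 1.0, 1.0]))

Note that the super-Turing results stated in the abstract concern the formal model, where the plasticity schedule is unconstrained; a finite simulation like this one can only illustrate the mechanism, not the computational power itself.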