Doing More with Less: Computational Role of Information Structure in Neural Networks based on Entropy Maximization
Abstract
We propose a bio-inspired concept based on the maximization of entropy in neural networks for memory storage and higher-order cognitive skills. We emphasize the role of information structure in mapping high-resolution inputs onto extremely low-resolution neurons. Although individual neurons are unreliable due to intrinsic noise and other limitations, their interaction allows error-free reconstruction. In particular, we show that the number of neurons required for reconstruction grows only linearly while the resolution of the input grows exponentially. By manipulating the information structure of neurons, we can make them sensitive to symbolic information in signals, such as hierarchical binary trees or the relative order of elements in sequences. Such features are hallmarks of symbolic systems and of higher-order cognitive skills.
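The scaling claim can be illustrated with a minimal sketch (not the paper's model): if an input of resolution R is distributed across binary units, roughly log2(R) units suffice for exact reconstruction, so the unit count grows linearly while the resolution grows exponentially. The function names and encoding below are hypothetical illustrations.

```python
import math

def encode(value, resolution):
    """Distribute a high-resolution input value across binary 'neurons'.

    Only ceil(log2(resolution)) units are needed: doubling the resolution
    adds a single unit (illustrative only, not the authors' architecture).
    """
    n_neurons = math.ceil(math.log2(resolution))
    return [(value >> i) & 1 for i in range(n_neurons)]

def decode(neurons):
    """Reconstruct the original value exactly from the binary unit states."""
    return sum(bit << i for i, bit in enumerate(neurons))

for resolution in (2**8, 2**16, 2**32):   # exponentially growing input resolution
    value = resolution - 1                 # worst-case input value
    neurons = encode(value, resolution)
    assert decode(neurons) == value        # error-free reconstruction
    print(f"resolution {resolution:>12}: {len(neurons)} neurons")
```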
Domains
Artificial Intelligence [cs.AI]

Origin: Files produced by the author(s)