Comparing information-theoretic measures of complexity in Boltzmann machines
Publication Type
Journal Article
Date Issued
2017-07-03
Language
English
Author(s)
Journal
Entropy
Volume
19
Issue
7
Start Page
310
Article Number
310
Citation
Entropy 19 (7): 310 (2017)
Publisher DOI
Scopus ID
ArXiv ID
Publisher
MDPI
In the past three decades, many theoretical measures of complexity have been proposed to help understand complex systems. In this work, for the first time, we place these measures on a level playing field to explore their qualitative similarities, differences, and shortcomings. Specifically, using the Boltzmann machine architecture (a fully connected recurrent neural network) with uniformly distributed weights as our model of study, we numerically measure how complexity changes as a function of network dynamics and network parameters. We then apply an extension of one such information-theoretic measure of complexity to understand incremental Hebbian learning in Hopfield networks, a fully recurrent architecture for autoassociative memory. Over the course of Hebbian learning, the total information flow reflects a natural upward trend in complexity as the network attempts to learn more and more patterns.
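The abstract refers to incremental Hebbian learning in a Hopfield network. The following is a minimal illustrative sketch of that standard learning rule, not the authors' code: the network size, number of patterns, noise level, and use of NumPy are assumptions made here for demonstration only.

```python
# Hypothetical sketch of incremental Hebbian learning in a Hopfield network.
# Parameters (N, num_patterns, noise level) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 16                      # number of neurons (assumed)
num_patterns = 5            # patterns presented one at a time

W = np.zeros((N, N))        # symmetric weights, no self-connections
patterns = rng.choice([-1, 1], size=(num_patterns, N))

for k, p in enumerate(patterns, start=1):
    # Incremental Hebbian update: add the outer product of the new pattern.
    W += np.outer(p, p) / N
    np.fill_diagonal(W, 0.0)

    # Recall check: corrupt 25% of the bits of the newest pattern, then
    # relax the state with asynchronous sign updates of the local field.
    state = p.copy()
    flip = rng.choice(N, size=N // 4, replace=False)
    state[flip] *= -1
    for _ in range(10):
        for i in rng.permutation(N):
            state[i] = 1 if W[i] @ state >= 0 else -1
    print(f"after {k} patterns: bit overlap with stored pattern = {np.mean(state == p):.2f}")
```

As more patterns are stored, recall quality degrades toward the network's capacity limit; the paper's point is that an information-theoretic complexity measure (total information flow) tracks this learning process.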
Subjects
Boltzmann machine
Complexity
Hebbian learning
Hopfield network
Information geometry
Information integration
Computer Science - Information Theory
Computer Science - Neural and Evolutionary Computing
Mathematics - Information Theory
Quantitative Biology - Neurons and Cognition
DDC Class
004: Computer science
510: Mathematics
600: Technology