Maximal information divergence from statistical models defined by neural networks
Publication Type
Conference Paper
Publication Date
2013-08
Language
English
First published in
Lecture Notes in Computer Science
Number in series
8085 LNCS
Start Page
759
End Page
766
Citation
Lecture Notes in Computer Science 8085 LNCS: 759-766 (2013-10-08)
Publisher
Springer
We review recent results on the maximal values of the Kullback-Leibler information divergence from statistical models defined by neural networks, including naïve Bayes models, restricted Boltzmann machines, deep belief networks, and various classes of exponential families. We illustrate approaches for computing the maximal divergence from a given model, starting from simple sub- or super-models. We also give a new result for deep and narrow belief networks with finite-valued units.
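The smallest instance of the quantity surveyed here is the maximal divergence from the independence model of two binary variables: the rI-projection of a joint distribution onto that model is the product of its marginals, so the divergence from the model reduces to the mutual information, whose maximum is log 2, attained by the uniform distribution on {(0,0),(1,1)}. The following Python sketch (with hypothetical helper names, not taken from the paper) illustrates this numerically by crude random search.

```python
import numpy as np

def divergence_from_independence(p):
    """KL divergence D(p || q*) from the independence model of two binary
    variables, where q* is the product of the marginals of p (the
    rI-projection onto the model); this equals the mutual information."""
    px = p.sum(axis=1, keepdims=True)  # marginal of the first variable
    py = p.sum(axis=0, keepdims=True)  # marginal of the second variable
    q = px * py                        # projection onto the independence model
    mask = p > 0                       # terms with p = 0 contribute 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Crude random search for the divergence maximizer over 2x2 joints.
rng = np.random.default_rng(0)
best_d, best_p = -np.inf, None
for _ in range(100_000):
    p = rng.dirichlet(np.ones(4)).reshape(2, 2)
    d = divergence_from_independence(p)
    if d > best_d:
        best_d, best_p = d, p

print(best_d)  # approaches log(2) ~ 0.6931
print(best_p)  # mass concentrates on {(0,0),(1,1)} or {(0,1),(1,0)}
```

For the larger models treated in the paper (restricted Boltzmann machines, deep belief networks), no such closed-form projection is available, which is why the reviewed results bound the maximal divergence via sub- and super-models instead.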
Keywords
exponential family
Kullback-Leibler divergence
multi-information
neural network
DDC Class
004: Computer Science