Relative entropy (Kullback-Leibler divergence) is one of the fundamental concepts of information theory and statistics. I will introduce FinStat, a category in which Bayesian inference takes place, and explain how relative entropy defines a functor on FinStat. This observation can be used to characterize relative entropy uniquely by a few simple properties. Finally, I will speculate on how the fact that relative entropy takes values in the real numbers might be derived from a localization of FinStat.
Based on joint work with John Baez.
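For reference, the standard definition underlying the talk (a textbook formula, not taken from the abstract itself): for probability distributions p and q on a finite set, the relative entropy of p with respect to q is

\[
D(p \,\|\, q) \;=\; \sum_i p_i \log \frac{p_i}{q_i},
\]

with the conventions \(0 \log(0/q_i) = 0\) and \(D(p \,\|\, q) = \infty\) whenever \(p_i > 0\) but \(q_i = 0\). In particular, relative entropy takes values in \([0, \infty]\) rather than in the real numbers alone.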
Algebraic Topology and Information Theory
Tags: Bayesian inference, Kullback-Leibler divergence, relative entropy