Relative entropy (Kullback–Leibler divergence) is one of the fundamental concepts of information theory and statistics. I will introduce FinStat, a category in which Bayesian inference takes place, and explain how relative entropy can be regarded as a functor on FinStat. This observation can be used to characterize relative entropy uniquely through a few simple properties. Finally, I will speculate on how the fact that relative entropy takes values in the real numbers might be derived in terms of a localization of FinStat.
Based on joint work with John Baez.
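As a reminder (not part of the original abstract), the relative entropy referred to above is the standard quantity for finite probability distributions:

```latex
% Relative entropy (Kullback–Leibler divergence) of probability
% distributions p and q on a finite set X, assuming q(x) > 0
% wherever p(x) > 0 (otherwise it is defined to be +infinity):
\[
  D(p \,\|\, q) \;=\; \sum_{x \in X} p(x) \log \frac{p(x)}{q(x)}
\]
```

It is nonnegative and vanishes exactly when p = q, which is part of what makes a functorial characterization plausible.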


Tobias Fritz

Algebraic Topology and Information Theory