In this talk I will present the formalism of information topology, developed with Daniel Bennequin, and explain how it generalises Shannon's theory of communication and provides the foundation of a mathematical theory of complex systems. Information cohomology is based on information structures, which formalise the algebra and geometry of random variables and probabilities; they encode the prior constraints on the system, the complexes of cochains as measurable functions, and the site of an information topos. In degree 1, the Hochschild-Cartan-Eilenberg-MacLane coboundary gives the chain rule of information as the first cocycle condition. We recover, uniquely up to an arbitrary multiplicative constant, the Shannon-Gibbs entropy as the first cohomology group-functor. This proves the existence of negentropy, and shows that Shannon's axiom of non-negativity is independent and undecidable.
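To make the degree-1 statement concrete, here is a minimal sketch in standard notation (my own summary, not verbatim from the talk): with conditioning as the module action, the vanishing of the Hochschild coboundary on a 1-cochain is exactly the chain rule of entropy.

```latex
% Degree-1 sketch: the Hochschild coboundary \delta, with conditioning
% as the action, applied to a 1-cochain F on variables is
%   (\delta F)(X; Y) = X.F(Y) - F(X,Y) + F(X).
% The cocycle condition \delta F = 0 is exactly the chain rule,
\[
  H(X,Y) \;=\; H(X) + X.H(Y),
  \qquad
  X.H(Y) \;=\; \sum_{x} P(X{=}x)\, H(Y \mid X{=}x),
\]
% and the Shannon-Gibbs entropy H(X) = -\sum_x P(x) \log P(x) is,
% up to a multiplicative constant, the solution.
```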
The structure of information is twice as rich: we define a second, classical topological coboundary, unravelling the mixed bicomplex structure of information. Mutual informations of even and odd degree are coboundaries for the topological and Hochschild structures, respectively. Hence the bicomplex encodes all the probabilistic independences and dependences in the cocycles and coboundaries of information. Negativity of the n-variable mutual information generalises synergistic collective interactions and provides the analogue of homotopical links (Milnor, Massey).
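As an illustration of the negativity claim, the following sketch (my own minimal example, not code from the talk) computes the 3-variable mutual information of two independent uniform bits X, Y and their XOR Z, the canonical purely synergistic system:

```python
# Minimal sketch: the 3-variable mutual information
#   I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z)
# is negative for XOR, the textbook example of synergy.
from itertools import product
from math import log2

def H(p):
    """Shannon entropy of a distribution {outcome: probability}."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def marginal(joint, axes):
    """Marginalise a joint law {(x, y, z): p} onto the given axes."""
    out = {}
    for xs, p in joint.items():
        key = tuple(xs[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

# X, Y independent uniform bits, Z = X XOR Y
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

h = {axes: H(marginal(joint, axes))
     for axes in [(0,), (1,), (2,), (0, 1), (0, 2), (1, 2), (0, 1, 2)]}

I3 = (h[(0,)] + h[(1,)] + h[(2,)]
      - h[(0, 1)] - h[(0, 2)] - h[(1, 2)]
      + h[(0, 1, 2)])
print(I3)  # -1.0 bit: pairwise independent, purely synergistic triple
```

With this (McGill) sign convention, I(X;Y;Z) = I(X;Y) - I(X;Y|Z), so the value of -1 bit records that observing Z creates one bit of dependence between the otherwise independent X and Y; this is the synergy the abstract refers to.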
Authors
Algebraic Topology and Information Theory
Keywords: cohomology, complex systems, entropy, information, links, mutual-information, synergy, topological data analysis
Photos by: NASA Goddard Space Flight Center