Tutorials

 
Modern Probability Theory Kevin H. Knuth, Departments of Physics and Informatics, University at Albany (SUNY), Albany NY, USA

A theory of logical inference should be all-encompassing, applying to any subject about which inferences are to be made.  This includes problems ranging from early applications to games of chance, to modern applications involving astronomy, biology, chemistry, geology, jurisprudence, physics, signal processing, sociology, and even quantum mechanics.  This paper focuses on how the theory of inference has evolved in recent history: expanding in scope, solidifying its foundations, deepening its insights, and growing in calculational power.

Biography: Kevin H. Knuth is an Associate Professor in the Departments of Physics and Informatics at the University at Albany (SUNY). He is Editor-in-Chief of the journal Entropy, the co-founder and President of a robotics company, Autonomous Exploration Inc., and a former NASA research scientist. He has 20 years of experience in designing Bayesian and maximum entropy-based machine learning algorithms for data analysis applied to the physical sciences. His interests in the foundations of inference and inquiry have led his theoretical investigations to focus on the foundations of physics from an information-based perspective. knuthlab.rit.albany.edu/

Basics of Information Geometry Ariel Caticha, Department of Physics, University at Albany, NY 12222, USA
A main concern of any theory of inference is to pick a probability distribution from a set of candidates, and this immediately raises many questions. What if we had picked a neighboring distribution? What difference would it make? What makes two distributions similar? To what extent can we distinguish one distribution from another? Are there quantitative measures of distinguishability? The goal of this tutorial is to address such questions by introducing methods of geometry. More specifically, the goal will be to introduce a notion of “distance” between two probability distributions.
A parametric family of probability distributions forms a statistical manifold, namely, a space in which each point represents a probability distribution. Generic manifolds do not come with a pre-installed notion of distance; such additional structure has to be purchased separately in the form of a metric tensor. Statistical manifolds are, however, an exception: a theorem due to N. Čencov (1981) states that up to an overall scale factor there is only one metric that takes into account the fact that these are not distances between simple structureless dots but distances between probability distributions.
To educate our intuition I will briefly sketch a couple of derivations of the information metric and provide a couple of examples. I will not develop the subject in all its possibilities but I will emphasize one specific result. Having a notion of distance means we have a notion of volume, and this in turn implies that there is a unique and objective notion of a distribution that is uniform over the space of parameters—equal volumes are assigned equal probabilities. Whether such uniform distributions are maximally non-informative, or whether they define ignorance, or whether they reflect the actual prior beliefs of any rational agent, are all important issues, but they are quite beside the specific point to be made here: that these distributions are uniform, and this is not a matter of subjective judgment but of objective mathematical proof.
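To make the metric concrete, here is a small numerical sketch (my own illustration, not part of the tutorial material): for the univariate normal family with parameters (mu, sigma), the Fisher information metric is g = diag(1/sigma^2, 2/sigma^2), and the Kullback-Leibler divergence between neighboring distributions agrees to second order with half the squared displacement measured by g:

```python
import math

def kl_normal(mu0, s0, mu1, s1):
    """KL divergence KL( N(mu0, s0^2) || N(mu1, s1^2) )."""
    return math.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

mu, s, d = 0.0, 1.5, 1e-3  # base point and a small displacement

# Shift in mu: KL matches (1/2) * g_mumu * d^2 with g_mumu = 1/s^2
kl_mu = kl_normal(mu, s, mu + d, s)
print(kl_mu, 0.5 * d**2 / s**2)

# Shift in sigma: KL matches (1/2) * g_ss * d^2 with g_ss = 2/s^2
kl_s = kl_normal(mu, s, mu, s + d)
print(kl_s, d**2 / s**2)
```

The agreement is exact for the mu-shift and holds to third order in d for the sigma-shift, which is precisely the sense in which the Fisher metric measures local distinguishability.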

Biography: Ariel Caticha is professor of physics at the University at Albany–SUNY. In recent years his research has focused on the connection between physics and information. One goal has been to develop an entropic inference framework that addresses the central issues—the nature of information and how it ought to be processed through Bayesian, entropic, and geometric methods—in a unified manner. The other goal has been to derive the laws of physics as examples of entropic inference. In this view the laws of physics do not directly reflect nature itself. They are not “Laws of Nature”. Instead, the laws of physics reflect our schemes for processing information about nature. Caticha’s papers on entropic inference and on its applications to the foundations of statistical mechanics, quantum mechanics, and general relativity can be found at http://www.albany.edu/physics/acaticha.shtml


Voronoi diagrams in information geometry Frank Nielsen, Ecole polytechnique, France
We consider the Voronoi diagrams of finite sets of parametric statistical distributions induced by statistical distortion measures. Choosing the Fisher-Rao Riemannian metric distance yields hyperbolic Voronoi diagrams [1] for location-scale families, including the normal distributions; this is the Riemannian framework of statistical geometry. Selecting the asymmetric Kullback-Leibler divergence instead yields Bregman Voronoi diagrams [2] for exponential families; this is the dually flat framework of statistical geometry, in which we unravel the doubly dual orthogonal Voronoi diagrams and regular triangulations.
Applications to computing the Chernoff information [3] and hypothesis testing [4] are presented.
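As a minimal sketch of the Bregman (Kullback-Leibler) Voronoi assignment over a small set of univariate normals (the sites and query values below are illustrative assumptions of mine, not taken from the references):

```python
import math

def kl_normal(mu0, s0, mu1, s1):
    """Kullback-Leibler divergence KL( N(mu0, s0^2) || N(mu1, s1^2) )."""
    return math.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

# Voronoi generators: three normal distributions given as (mu, sigma).
sites = [(-2.0, 1.0), (0.0, 0.5), (3.0, 2.0)]

def kl_voronoi_cell(query, sites):
    """Index of the site whose KL-Voronoi cell contains the query."""
    return min(range(len(sites)), key=lambda i: kl_normal(*query, *sites[i]))

cell = kl_voronoi_cell((0.2, 0.6), sites)
print(cell)  # → 1
```

Because the divergence is asymmetric, swapping the argument order in `kl_normal` gives the dual (right-sided) diagram, which is the source of the doubly dual structure mentioned above.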
References:
[1] Hyperbolic Voronoi Diagrams Made Easy. IEEE ICCSA 2010. Visualizing hyperbolic Voronoi diagrams: http://www.youtube.com/watch?v=i9IUzNxeH4o
[2] Bregman Voronoi Diagrams. Discrete & Computational Geometry 44(2):281-307 (2010). Visualizing Bregman Voronoi diagrams: http://www.youtube.com/watch?v=RJiVKflV6Lo
[3] An Information-Geometric Characterization of Chernoff Information. IEEE Signal Process. Lett. 20(3):269-272 (2013)
[4] Hypothesis Testing, Information Divergence and Computational Geometry. GSI 2013: 241-248

Biography: Frank Nielsen received his PhD on adaptive computational geometry in 1996 from the University of Nice, France. After serving in the French army in 1997, he joined the Sony Computer Science Laboratories, Japan. He became professor of computer science at Ecole polytechnique in 2008. His research interests concentrate on computational information geometry for imaging and learning.


Foundations and Geometry John Skilling
[abstract]

Uncertainty quantification for computer models Udo V. Toussaint, Max-Planck-Institut für Plasmaphysik, Germany
The quantification of uncertainty for complex simulations is of increasing importance as well as a significant challenge. Bayesian and non-Bayesian probabilistic uncertainty quantification methods, such as polynomial chaos (PC) expansions or Gaussian processes, have found increasing use in recent years. This contribution describes the use of Gaussian processes and collocation methods for the propagation of uncertainty in computational models, using illustrative examples as well as real-world problems. In addition, existing challenges such as phase transitions are outlined.
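As a minimal illustration of the collocation idea (my own toy example; the model function is a hypothetical stand-in for an expensive simulator), a three-node probabilists' Gauss-Hermite rule propagates a standard-normal input uncertainty exactly for low-order polynomial responses:

```python
import math

# Probabilists' Gauss-Hermite rule with 3 nodes: exact for polynomials
# up to degree 5 under the N(0, 1) weight.
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def model(x):
    """Hypothetical simulator response (stand-in for an expensive code)."""
    return 0.5 * x**2 + x

# Collocation: evaluate the model only at the quadrature nodes.
mean = sum(w * model(x) for w, x in zip(weights, nodes))
var = sum(w * (model(x) - mean) ** 2 for w, x in zip(weights, nodes))
print(mean, var)  # exact values are 0.5 and 1.5
```

The same node evaluations can serve as training data for a Gaussian-process surrogate when the response is not polynomial, which is the regime the tutorial addresses.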

Biography: Udo v. Toussaint received his PhD on Bayesian inference in ion-beam techniques from the University of Bayreuth, Germany, in 2000. Subsequently he worked with Peter Cheeseman at the NASA Ames Research Center, Mountain View, CA, USA, on computer vision and data fusion problems. In 2002 he joined the Max-Planck-Institut für Plasmaphysik (IPP), focusing on atomistic modelling and inverse problems. He became a lecturer at TU Graz, Austria, in 2012 and has headed a research group at IPP since 2013.


Koszul Information Geometry and Souriau Lie Group Thermodynamics Frédéric Barbaresco, Thales Air Systems, Advanced Radar Concepts Business Unit, France
The Koszul-Vinberg Characteristic Function (KVCF) is a dense knot tying together important mathematical fields such as Hessian geometry, Kählerian geometry, and affine differential geometry. This paper develops the KVCF as the foundation of Information Geometry, a concept transverse to thermodynamics, statistical physics, and probability. From the general definition of the KVCF, the paper defines the Koszul entropy, which coincides with the Legendre transform of minus the logarithm of the KVCF (their gradients defining mutually inverse diffeomorphisms). These dual functions are compared, by analogy with thermodynamics, to the dual Massieu-Duhem potentials. The Hessian of minus the logarithm of the KVCF provides a non-arbitrary Riemannian metric for Information Geometry. We will observe the fundamental property that the barycenter of the Koszul entropy equals the Koszul entropy of the barycenter. We then present a generalization of the characteristic function due to the physicist Jean-Marie Souriau in statistical physics, introducing the concept of the co-adjoint action of a group on its momentum space and defining physical observables such as energy, heat, and momentum as purely geometrical objects. We will compare the moment map with the dual coordinate in the Koszul model (the barycenter where entropy is maximum) and give a vector-valued definition of Maximum Entropy. In the covariant Souriau model, Gibbs equilibrium states are indexed by a geometric parameter, the Geometric Temperature, with values in the Lie algebra of the dynamical group, interpreted as a space-time vector (a vector-valued Planck temperature) with respect to which the metric tensor has null Lie derivative. The Fisher information metric appears as minus the derivative of the moment map with respect to the geometric temperature, equivalent to a Geometric Capacity. We will synthesize the analogies between the Koszul and Souriau models, where Information Geometry is considered a particular case of Koszul Hessian geometry.
The Information Geometry metric is then characterized by invariances: by automorphisms of the convex cone or by dynamical groups. We conclude by interpreting the Legendre transform as a Fourier transform in the (Min,+) algebra, with a new definition of entropy given by the relation: Entropy = −Fourier(Min,+) ∘ Log ∘ Laplace(+,×).
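The closing relation can be illustrated numerically with a sketch under my own conventions: a grid discretization of the Legendre-Fenchel transform written in its (max,+) form, which plays the role of the (Min,+) Fourier transform evoked above; the quadratic f(x) = x²/2 is its own transform:

```python
# Discrete Legendre-Fenchel transform f*(p) = max_x (p*x - f(x)):
# a sup of sums, i.e. the (max,+)-algebra analogue of an integral transform.
xs = [i * 0.01 - 5.0 for i in range(1001)]  # grid on [-5, 5]

def f(x):
    return 0.5 * x * x  # self-dual: f*(p) = 0.5 * p^2

def legendre(p):
    return max(p * x - f(x) for x in xs)

err = max(abs(legendre(p) - 0.5 * p * p) for p in [-2.0, -1.0, 0.0, 1.5, 2.0])
print(err)  # small grid-discretization error
```

The involutive, Fourier-like behavior of this transform (applying it twice recovers a convex f) is what underlies the stated entropy formula.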
References:
[1] Barbaresco F., Information Geometry of Covariance Matrix: Cartan-Siegel Homogeneous Bounded Domains, Mostow/Berger Fibration and Fréchet Median, Matrix Information Geometry (R. Bhatia and F. Nielsen, Eds.), pp. 199-256, Springer, 2012
[2] Barbaresco F., Eidetic Reduction of Information Geometry through Legendre Duality of Koszul Characteristic Function and Entropy: from Massieu-Duhem Potentials to Geometric Souriau Temperature and Balian Quantum Fisher Metric, Geometric Theory of Information, Springer, 2014
[3] Barbaresco F., Koszul Information Geometry and Souriau Geometric Temperature/Capacity of Lie Group Thermodynamics, MDPI Entropy, n°16, special issue on Information Geometry, 2014 

Biography: Frédéric Barbaresco received the State Engineering degree from the French Grande École SUPELEC, Paris, France, in 1991. He then joined the THALES Group, where he is now Senior Scientist and Advanced Studies Manager in the Advanced Radar Concepts Department of the Surface Radar Domain of THALES AIR SYSTEMS. He was awarded the Aymé Poirson Prize by the French Academy of Sciences in 2014. He is an IEEE AES member and has been an Emeritus Member of SEE since 2011; he was awarded the Ampère Medal by SEE in 2007 and the NATO SET Lecture Award in 2012 for his NATO lectures, given from 2008 to 2011 in Europe and North America, on waveform diversity and design for advanced radar systems. In 2009 he launched the Léon Brillouin Seminar on Geometric Sciences of Information, hosted by IRCAM and IHP in Paris, France. He has organized special sessions on Information Geometry at the radar conferences IRS’11, IRS’13, and EURAD’12, and he will give a lecture at the International Radar’14 conference on “Modern Radar Processing based on Geometry of Structured Matrices & Information Geometry”. He is a coordinator of the Springer Lecture Notes “Matrix Information Geometry”, published in 2012, and was an invited lecturer for the UNESCO “Advanced School and Workshop on Matrix Geometries and Applications” in Trieste in June 2013. He is the General Chairman of the new international conference GSI “Geometric Sciences of Information: Information Geometry Manifolds and their Advanced Applications”, whose first edition, GSI’13, was held at École des Mines in Paris during summer 2013. He is an editor of the book “Geometric Science of Information”, published by Springer in 2013, and a contributor to the Springer book “Geometric Theory of Information”, published in 2014. He is co-editor of the MDPI Entropy journal special issue “Information, Entropy and their Geometric Structures”. His research interests mainly focus on geometric science of information for robust radar processing.
He is the author or coauthor of over 90 papers and the inventor of over 15 patents for the THALES Group.
