GSI2015 (2015)

Asymptotics of superposition of point processes Aurélien Vasseur, Laurent Decreusefond GSI2015
The characteristic independence property of Poisson point processes gives an intuitive way to explain why a sequence of point processes becoming less and less repulsive can converge to a Poisson point process. The aim of this paper is to show this convergence for sequences built by superposing, thinning or rescaling determinantal processes. We use Papangelou intensities and Stein’s method to prove this result with a topology based on total variation distance.
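The intuition can be illustrated numerically: superpose many independent copies of a repulsive process, each heavily thinned, and the counts in a window lose their repulsive signature. The following Python sketch, which uses a simple hard-core process as a stand-in for a determinantal one and is not taken from the paper, shows the variance-to-mean ratio of the counts moving towards the Poisson value 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def hard_core_sample(intensity=100.0, min_dist=0.03):
    """Crude repulsive process on the unit square: Poisson proposals + minimum-distance rejection."""
    proposals = rng.random((rng.poisson(intensity), 2))
    kept = []
    for p in proposals:
        if not kept or np.min(np.linalg.norm(np.array(kept) - p, axis=1)) >= min_dist:
            kept.append(p)
    return np.array(kept)

def thin(points, keep_prob):
    """Independent thinning: keep each point with probability keep_prob."""
    return points[rng.random(len(points)) < keep_prob] if len(points) else points

def superpose(n_copies):
    """Superpose n_copies independent copies, each thinned with retention probability 1/n_copies."""
    parts = [thin(hard_core_sample(), 1.0 / n_copies) for _ in range(n_copies)]
    parts = [p for p in parts if len(p)]
    return np.vstack(parts) if parts else np.empty((0, 2))

# As more heavily thinned copies are superposed, the variance-to-mean ratio of the
# total count moves towards 1, the value characteristic of a Poisson point process.
for n_copies in (1, 5, 20):
    counts = [len(superpose(n_copies)) for _ in range(200)]
    print(n_copies, np.var(counts) / np.mean(counts))
```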
Asymptotic properties of random polytopes Pierre Calka GSI2015
Random polytopes have constituted some of the central objects of stochastic geometry for more than 150 years. They are in general generated as convex hulls of a random set of points in the Euclidean space. The study of such models requires the use of ingredients coming from both convex geometry and probability theory. In the last decades, the study has been focused on their asymptotic properties and in particular expectation and variance estimates. In several joint works with Tomasz Schreiber and J. E. Yukich, we have investigated the scaling limit of several models (uniform model in the unit-ball, uniform model in a smooth convex body, Gaussian model) and have deduced from it limiting variances for several geometric characteristics including the number of k-dimensional faces and the volume. In this paper, we survey the most recent advances on these questions and we emphasize the particular cases of random polytopes in the unit-ball and Gaussian polytopes.
Optimal mass transport over bridges Michele Pavon, Tryphon Georgiou, Yonxin Chen GSI2015
We present an overview of our recent work on implementable solutions to the Schrödinger bridge problem and their potential application to optimal transport and various generalizations.
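For discrete marginals, the static Schrödinger bridge reduces to entropic optimal transport, which is classically solved by Sinkhorn (iterative proportional fitting) iterations. The sketch below is a generic illustration of that connection on an assumed toy example; it is not the authors' implementation.

```python
import numpy as np

def sinkhorn_bridge(mu, nu, cost, eps=0.05, n_iter=500):
    """Static Schroedinger bridge / entropic OT between discrete marginals mu and nu.

    Iterative proportional fitting on the Gibbs kernel K = exp(-cost/eps) of the reference
    dynamics; returns the coupling P = diag(u) K diag(v) whose marginals match mu and nu."""
    K = np.exp(-cost / eps)
    u = np.ones_like(mu)
    v = np.ones_like(nu)
    for _ in range(n_iter):
        u = mu / (K @ v)
        v = nu / (K.T @ u)
    return u[:, None] * K * v[None, :]

# toy example: two discrete distributions on a grid of 50 points in [0, 1]
x = np.linspace(0.0, 1.0, 50)
mu = np.exp(-((x - 0.25) ** 2) / 0.01); mu /= mu.sum()
nu = np.exp(-((x - 0.75) ** 2) / 0.02); nu /= nu.sum()
cost = (x[:, None] - x[None, :]) ** 2

P = sinkhorn_bridge(mu, nu, cost)
# marginal constraints are met up to a small numerical error after the iterations
print(np.abs(P.sum(axis=1) - mu).max(), np.abs(P.sum(axis=0) - nu).max())
```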
Affine-invariant Riemannian Distance Between Infinite-dimensional Covariance Operators Minh Ha Quang GSI2015
This paper studies the affine-invariant Riemannian distance on the Riemann-Hilbert manifold of positive definite operators on a separable Hilbert space. This is the generalization of the Riemannian manifold of symmetric, positive definite matrices to the infinite-dimensional setting. In particular, in the case of covariance operators in a Reproducing Kernel Hilbert Space (RKHS), we provide a closed form solution, expressed via the corresponding Gram matrices.
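In the finite-dimensional case, the affine-invariant Riemannian distance between symmetric positive definite matrices A and B is d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F. The sketch below computes this standard formula for empirical covariance matrices and checks affine invariance; the paper's Gram-matrix formula for RKHS covariance operators is not reproduced here.

```python
import numpy as np

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B:
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F."""
    vals, vecs = np.linalg.eigh(A)
    A_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    w = np.linalg.eigvalsh(A_inv_sqrt @ B @ A_inv_sqrt)   # positive eigenvalues
    return np.sqrt(np.sum(np.log(w) ** 2))

# two empirical covariance matrices (regularized to stay positive definite)
rng = np.random.default_rng(0)
A = np.cov(rng.standard_normal((200, 5)), rowvar=False) + 1e-6 * np.eye(5)
B = np.cov(2.0 * rng.standard_normal((200, 5)), rowvar=False) + 1e-6 * np.eye(5)

print(airm_distance(A, B))
print(airm_distance(A, A))                       # ~0 on the diagonal
S = rng.standard_normal((5, 5)) + 5 * np.eye(5)  # affine invariance: d(SAS^T, SBS^T) = d(A, B)
print(airm_distance(S @ A @ S.T, S @ B @ S.T))
```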
Nonlinear operators on graphs via stacks Jesús Angulo, Santiago Velasco-Forero GSI2015
We consider a framework for nonlinear operators on functions evaluated on graphs via stacks of level sets. We investigate a family of transformations on functions evaluated on graphs which includes adaptive flat and non-flat erosions and dilations in the sense of mathematical morphology. Additionally, the connection to mean curvature motion on graphs is noted. The proposed operators are illustrated in the cases of functions on graphs, textured meshes and graphs of images.
Characterization and Estimation of the Variations of a Random Convex Set by its Mean $n$-Variogram: Application to the Boolean Model Jean-Charles Pinoli, Johan Debayle, Saïd Rahmani GSI2015
In this paper we propose a method to characterize and estimate the variations of a random convex set $\Xi_0$ in terms of shape, size and direction. The mean $n$-variogram $\gamma^{(n)}_{\Xi_0} : (u_1,\dots,u_n) \mapsto \mathbb{E}[\nu_d(\Xi_0 \cap (\Xi_0-u_1) \cap \dots \cap (\Xi_0-u_n))]$ of a random convex set $\Xi_0$ on $\mathbb{R}^d$ reveals information on the $n$th-order structure of $\Xi_0$. In particular, we show that by considering the mean $n$-variograms of the dilated random sets $\Xi_0 \oplus rK$ by a homothetic convex family $(rK)_{r>0}$, it is possible to estimate characteristics of the $n$th-order structure of $\Xi_0$. A judicious choice of $K$ provides relevant measures of $\Xi_0$. Fortunately, the germ-grain model is stable under convex dilations, and the mean $n$-variogram of the primary grain is estimable in several types of stationary germ-grain models through the so-called $n$-point probability function. Here we focus on the Boolean model. In the planar case we show how to estimate the $n$th-order structure of the random vector of mixed volumes ${}^t(A(\Xi_0), W(\Xi_0,K))$ of the primary grain, and we describe a procedure to do so from a realization of the Boolean model in a bounded window. We prove that this knowledge for all convex bodies $K$ is sufficient to fully characterize the so-called difference body of the grain, $\Xi_0 \oplus \check{\Xi}_0$. We then discuss the choice of the element $K$: by choosing a ball, the mixed volumes coincide with the Minkowski functionals of $\Xi_0$, and we obtain the moments of the random vector of area and perimeter ${}^t(A(\Xi_0), U(\Xi_0))$; by choosing a segment oriented by $\theta$, we obtain estimates of the moments of the random vector of area and Feret diameter in direction $\theta$, ${}^t(A(\Xi_0), H_{\Xi_0}(\theta))$. Finally, we evaluate the performance of the method on a Boolean model with rectangular grains for the estimation of the second-order moments of the random vectors ${}^t(A(\Xi_0), U(\Xi_0))$ and ${}^t(A(\Xi_0), H_{\Xi_0}(\theta))$.
Statistical Gaussian Model of Image Regions in Stochastic Watershed Segmentation Jesús Angulo GSI2015
Stochastic watershed is an image segmentation technique based on mathematical morphology which produces a probability density function of image contours. Estimated probabilities depend mainly on local distances between pixels. This paper introduces a variant of stochastic watershed where the probabilities of contours are computed from a Gaussian model of image regions. In this framework, the basic ingredient is the distance between pairs of regions, hence a distance between normal distributions. Several alternative statistical distances for normal distributions are therefore compared, namely the Bhattacharyya distance, the Hellinger metric distance and the Wasserstein metric distance.
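For two multivariate normals N(m1, Σ1) and N(m2, Σ2), the three distances compared in the paper admit closed forms; the sketch below implements the standard formulas (the Bhattacharyya distance, the Hellinger distance derived from it, and the 2-Wasserstein distance), independently of the paper's segmentation pipeline.

```python
import numpy as np
from scipy.linalg import sqrtm

def bhattacharyya(m1, S1, m2, S2):
    """Bhattacharyya distance between N(m1, S1) and N(m2, S2)."""
    S = 0.5 * (S1 + S2)
    dm = m1 - m2
    term1 = 0.125 * dm @ np.linalg.solve(S, dm)
    term2 = 0.5 * np.log(np.linalg.det(S) / np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return term1 + term2

def hellinger(m1, S1, m2, S2):
    """Hellinger distance: H^2 = 1 - exp(-Bhattacharyya distance)."""
    return np.sqrt(1.0 - np.exp(-bhattacharyya(m1, S1, m2, S2)))

def wasserstein2(m1, S1, m2, S2):
    """2-Wasserstein distance between the two Gaussians."""
    cross = sqrtm(sqrtm(S2) @ S1 @ sqrtm(S2))
    return np.sqrt(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * np.real(cross)))

m1, S1 = np.zeros(3), np.eye(3)
m2, S2 = np.ones(3), 2.0 * np.eye(3)
print(bhattacharyya(m1, S1, m2, S2), hellinger(m1, S1, m2, S2), wasserstein2(m1, S1, m2, S2))
```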
Fitting Smooth Paths on Riemannian Manifolds - Endometrial Surface Antoine Arnould, Chafik Samir, Michel Canis, Pierre-Antoine Absil, Pierre-Yves Gousenbourger GSI2015
We present a new method to fit smooth paths to a given set of points on Riemannian manifolds using C^1 piecewise-Bézier functions. A property of the method is that, when the manifold reduces to a Euclidean space, the control points minimize the mean square acceleration of the path. As an application, we focus on data observations that evolve on certain nonlinear manifolds of importance in medical imaging: the shape manifold for endometrial surface reconstruction; the special orthogonal group SO(3) and the special Euclidean group SE(3) for preoperative MRI-based navigation. Results on real data show that our method succeeds in meeting the clinical goal: combining different modalities to improve the localization of the endometrial lesions.
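In the Euclidean case the building block is a cubic Bézier segment evaluated by de Casteljau's algorithm, i.e. repeated linear interpolation of the control points; on a manifold each interpolation would be replaced by a point along a geodesic. The sketch below shows only this Euclidean building block, not the authors' fitting procedure.

```python
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t by repeated linear interpolation.

    On a Riemannian manifold, the straight-line interpolation below would be replaced by
    the point at parameter t along the geodesic joining each pair of consecutive points."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

# a cubic Bezier segment in the plane: endpoints b0, b3 and inner control points b1, b2
segment = [(0.0, 0.0), (0.3, 1.0), (0.7, 1.0), (1.0, 0.0)]
curve = np.array([de_casteljau(segment, t) for t in np.linspace(0.0, 1.0, 11)])
print(curve[0], curve[-1])  # the segment interpolates its endpoints b0 and b3
```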
Pontryagin calculus in Riemannian geometry Danielle Fortune, Francois Dubois, Juan Antonio Rojas Quintero GSI2015
In this contribution, we study systems with a finite number of degrees of freedom, as in robotics. A key idea is to consider the mass tensor associated to the kinetic energy as a metric in a Riemannian configuration space. We apply Pontryagin's framework to derive an optimal evolution of the control forces and torques applied to the mechanical system. The resulting equation, written in covariant form, explicitly involves the Riemann curvature tensor.
Online k-MLE for mixture modeling with exponential families Christophe Saint-Jean, Frank Nielsen GSI2015
This paper addresses the problem of online learning of finite statistical mixtures of exponential families. We first give a short review of the Expectation-Maximization (EM) algorithm and its online extensions. Building on these extensions and on the description of the k-Maximum Likelihood Estimator (k-MLE), we propose three online extensions of the latter. To illustrate them, we consider the case of mixtures of Wishart distributions, giving details and providing some experiments.
Multivariate L-moments based on transports Alexis Decurninge GSI2015
Univariate L-moments are expressed as projections of the quantile function onto an orthogonal basis of univariate polynomials. We present multivariate versions of L-moments expressed as collections of orthogonal projections of a multivariate quantile function on a basis of multivariate polynomials. We propose to consider quantile functions defined as transports from the uniform distribution on $[0,1]^d$ onto the distribution of interest and present some properties of the resulting L-moments. The properties of estimated L-moments are illustrated for heavy-tailed distributions.
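For reference, the univariate L-moments that this construction generalizes can be estimated from a sample by Hosking's probability-weighted-moment formulas. The sketch below implements these standard univariate estimators only; the multivariate, transport-based L-moments of the paper are not reproduced.

```python
import numpy as np
from scipy.special import comb

def sample_l_moments(x, n_moments=4):
    """First sample L-moments of a univariate sample (Hosking's unbiased estimators).

    Uses probability-weighted moments b_r and the standard conversion
    l1 = b0, l2 = 2 b1 - b0, l3 = 6 b2 - 6 b1 + b0, l4 = 20 b3 - 30 b2 + 12 b1 - b0."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b = [np.sum(comb(i - 1, r) * x) / (n * comb(n - 1, r)) for r in range(n_moments)]
    conv = {1: [1], 2: [-1, 2], 3: [1, -6, 6], 4: [-1, 12, -30, 20]}
    return [float(np.dot(conv[k], b[:k])) for k in range(1, n_moments + 1)]

rng = np.random.default_rng(0)
sample = rng.standard_t(2, 10_000)   # heavy-tailed: infinite variance, but finite mean,
print(sample_l_moments(sample))      # so the L-moments are well defined and estimable
```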
Reparameterization invariant metric on the space of curves Alice Le Brigant, Frédéric Barbaresco, Marc Arnaudon GSI2015
This paper focuses on the study of open curves in a manifold M, and its aim is to define a reparameterization invariant distance on the space of such paths. We use the square root velocity function (SRVF) introduced by Srivastava et al. in [11] to define a reparameterization invariant metric on the space of immersions $\mathcal{M}=\mathrm{Imm}([0,1],M)$ by pullback of a metric on the tangent bundle $TM$ derived from the Sasaki metric. We observe that such a natural choice of Riemannian metric on $TM$ induces a first-order Sobolev metric on $\mathcal{M}$ with an extra term involving the origins, and leads to a distance which takes into account the distance between the origins and the distance between the image curves by the SRVF parallel transported to a same vector space, with an added curvature term. This provides a generalized theoretical SRV framework for curves lying in a general manifold M.
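For a curve c in Euclidean space the square root velocity function is q(t) = c'(t)/√|c'(t)|, and |q(t)|² = |c'(t)|, so the squared L² norm of q equals the length of the curve. The sketch below checks this on a discretized planar curve; it is a Euclidean illustration only, not the manifold-valued construction of the paper.

```python
import numpy as np

def srvf(curve, t):
    """Square root velocity function q = c' / sqrt(|c'|) of a discretized curve.

    curve: array of shape (n, d) sampling c at the parameter values t (shape (n,))."""
    velocity = np.gradient(curve, t, axis=0)
    speed = np.linalg.norm(velocity, axis=1)
    return velocity / np.sqrt(speed)[:, None]

# a planar test curve: half circle of radius 2, whose length is 2*pi
t = np.linspace(0.0, 1.0, 2000)
curve = np.stack([2 * np.cos(np.pi * t), 2 * np.sin(np.pi * t)], axis=1)
q = srvf(curve, t)

# |q(t)|^2 = |c'(t)|, so integrating |q|^2 over [0, 1] recovers the curve length
length = np.sum(np.sum(q ** 2, axis=1)[:-1] * np.diff(t))
print(length, 2 * np.pi)
```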
Random Pairwise Gossip on CAT(k) Metric Spaces Anass Bellachehab, Jérémie Jakubowicz GSI2015
In the context of sensor networks, gossip algorithms are a popular, well-established technique for achieving consensus when sensor data are encoded in linear spaces. Gossip algorithms also have several extensions to nonlinear data spaces. Most of these extensions deal with Riemannian manifolds and use Riemannian gradient descent. This paper, instead, studies gossip in a broader CAT(k) metric setting, encompassing, but not restricted to, several interesting cases of Riemannian manifolds. As it turns out, convergence can be guaranteed as soon as the data lie in a small enough ball of a mere CAT(k) metric space. We also study convergence speed in this setting and establish linear rates of convergence.
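In the linear baseline recalled above, one gossip step draws a random edge and moves both endpoints to their midpoint; the average is preserved and the values contract to consensus. The sketch below illustrates this Euclidean baseline (in a CAT(k) space the midpoint would be the geodesic midpoint); it is not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pairwise_gossip(values, edges, n_steps=2000):
    """Random pairwise midpoint gossip: at each step a random edge (i, j) is drawn and
    both endpoints are replaced by their midpoint.  In a CAT(k) metric space the
    Euclidean midpoint below would be replaced by the geodesic midpoint."""
    values = values.copy()
    for _ in range(n_steps):
        i, j = edges[rng.integers(len(edges))]
        mid = 0.5 * (values[i] + values[j])
        values[i] = mid
        values[j] = mid
    return values

# ring network of 10 sensors, each holding a 2-d measurement
n = 10
edges = [(k, (k + 1) % n) for k in range(n)]
values = rng.standard_normal((n, 2))
out = pairwise_gossip(values, edges)
print(out.std(axis=0))                          # spread shrinks towards 0 (consensus)
print(values.mean(axis=0), out.mean(axis=0))    # the average is preserved at every step
```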
Multivariate divergences with application in multisample density ratio models Amor Keziou GSI2015
We introduce what we will call multivariate divergences between K, K ≥ 1, signed finite measures (Q1, …, QK) and a given reference probability measure P on a σ-field (X, B), extending the well-known divergences between two measures, a signed finite measure Q1 and a given probability distribution P. We investigate the Fenchel duality theory for the introduced multivariate divergences, viewed as convex functionals on well-chosen topological vector spaces of signed finite measures. We obtain new dual representations of these criteria, which we use to define a new family of estimates and test statistics with multiple samples under multiple semiparametric density ratio models. This family contains the estimate and test statistic obtained through empirical likelihood. Moreover, the present approach allows us to obtain the asymptotic properties of the estimates and test statistics both under the model and under misspecification. This leads to accurate approximations of the power function for any of the considered criteria, including the empirical likelihood one, which is of interest in its own right. Finally, the proposed multivariate divergences can be used, in the context of multiple samples in density ratio models, to define new criteria for model selection and multi-group classification.
Kernel Density Estimation on Symmetric Spaces Dena Asta GSI2015
We introduce a novel kernel density estimator for a large class of symmetric spaces and prove a minimax rate of convergence as fast as the minimax rate on Euclidean space. The rate is proven without any compactness assumptions on the space or Hölder-class assumptions on the densities. A main tool used in proving the convergence rate is the Helgason-Fourier transform, a generalization of the Fourier transform for semisimple Lie groups modulo maximal compact subgroups. This paper obtains a simplified formula in the special case when the symmetric space is the 2-dimensional hyperboloid.
The Pontryagin Forms of Hessian Manifolds John Armstrong, Shun-Ichi Amari GSI2015
We show that Hessian manifolds of dimensions 4 and above must have vanishing Pontryagin forms. This gives a topological obstruction to the existence of Hessian metrics. We find an additional explicit curvature identity for Hessian 4-manifolds. By contrast, we show that all analytic Riemannian 2-manifolds are Hessian.
From Euclidean to Riemannian Means: Information Geometry for SSVEP Classification Emmanuel Kalunga, Eric Monacelli, Karim Djouani, Quentin Barthélemy, Sylvain Chevallier, Yskandar Hamam GSI2015
Brain Computer Interfaces (BCI) based on electroencephalography (EEG) rely on multichannel brain signal processing. Most of the state-of-the-art approaches deal with covariance matrices, and indeed Riemannian geometry has provided a substantial framework for developing new algorithms. Most notably, a straightforward algorithm such as Minimum Distance to Mean yields competitive results when applied with a Riemannian distance. This applicative contribution aims at assessing the impact of several distances on a real EEG dataset, as the invariances embedded in those distances have an influence on the classification accuracy. Euclidean and Riemannian distances and means are compared both in terms of quality of results and of computational load.
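The Minimum Distance to Mean classifier mentioned above assigns a trial covariance matrix to the class whose mean covariance is closest for the chosen distance. The sketch below is a generic illustration using the affine-invariant Riemannian distance and, for simplicity, a log-Euclidean class mean instead of the Karcher mean; the data are synthetic, not EEG.

```python
import numpy as np

def spd_log(C):
    """Matrix logarithm of a symmetric positive definite matrix."""
    w, V = np.linalg.eigh(C)
    return V @ np.diag(np.log(w)) @ V.T

def spd_exp(S):
    """Matrix exponential of a symmetric matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.exp(w)) @ V.T

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices."""
    w, V = np.linalg.eigh(A)
    A_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    return np.sqrt(np.sum(np.log(np.linalg.eigvalsh(A_inv_sqrt @ B @ A_inv_sqrt)) ** 2))

def mdm_predict(trial_cov, class_means):
    """Minimum Distance to Mean: assign to the class whose mean covariance is closest."""
    return int(np.argmin([airm_distance(trial_cov, M) for M in class_means]))

# toy data: two classes of covariance matrices differing in scale; the class "mean"
# is a log-Euclidean mean, used here as a simple surrogate for the Riemannian mean
rng = np.random.default_rng(0)
random_cov = lambda scale, d=4, n=100: np.cov(scale * rng.standard_normal((n, d)), rowvar=False)
train = {c: [random_cov(s) for _ in range(20)] for c, s in ((0, 1.0), (1, 2.0))}
means = [spd_exp(np.mean([spd_log(C) for C in train[c]], axis=0)) for c in (0, 1)]
tests = [(random_cov(1.0), 0), (random_cov(2.0), 1)]
print([(mdm_predict(C, means), label) for C, label in tests])
```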
A generalization of independence and multivariate Student's t-distributions Hiroshi Matsuzoe, Monta Sakamoto GSI2015
In anomalous statistical physics, deformed algebraic structures are important objects. Heavy-tailed probability distributions, such as Student's t-distributions, are characterized by deformed algebras. In addition, deformed algebras cause deformations of expectations and independences of random variables. Hence, a generalization of independence for the multivariate Student's t-distribution is studied in this paper. Even if two random variables that follow univariate Student's t-distributions are independent, the joint probability distribution of these two variables is not a bivariate Student's t-distribution. It is shown that a bivariate Student's t-distribution is obtained from two univariate Student's t-distributions under q-deformed independence.
A sub-Riemannian modular approach for diffeomorphic deformations Alain Trouvé, Barbara Gris, Stanley Durrleman GSI2015
We develop a generic framework to build large deformations from a combination of base modules. These modules constitute a dynamical dictionary to describe transformations. The method, built on a coherent sub-Riemannian framework, defines a metric on modular deformations and characterises optimal deformations as geodesics for this metric. We present a generic way to build local affine transformations as deformation modules, and display examples.
Quantization of hyperspectral image manifold using probabilistic distances Gianni Franchi, Jesús Angulo GSI2015
A technique of spatial-spectral quantization of hyperspectral images is introduced. A quantized hyperspectral image is thus summarized by K spectra which represent the spatial and spectral structures of the image. The proposed technique is based on α-connected components on a region adjacency graph. The main ingredient is a dissimilarity metric. In order to choose the metric that best fits the hyperspectral data manifold, a comparison of different probabilistic dissimilarity measures is carried out.
PDE Constrained Shape Optimization as Optimization on Shape Manifolds Kathrin Welker, Martin Siebenborn, Volker Schulz GSI2015
The novel Riemannian view on shape optimization introduced in [14] is extended to a Lagrange–Newton as well as a quasi–Newton approach for PDE constrained shape optimization problems.
Rolling Symmetric Spaces Fátima Leite, Krzysztof Krakowski, Luís Machado GSI2015
Riemannian symmetric spaces play an important role in many areas that are interrelated to information geometry. For instance, in image processing one of the most elementary tasks is image interpolation. Since a set of images may be represented by a point in the Graßmann manifold, image interpolation can be formulated as an interpolation problem on that symmetric space. It turns out that rolling motions, subject to nonholonomic constraints of no-slip and no-twist, provide efficient algorithms to generate interpolating curves on certain Riemannian manifolds, in particular on symmetric spaces. The main goal of this paper is to study rolling motions on symmetric spaces. It is shown that the natural decomposition of the Lie algebra associated to a symmetric space provides the structure of the kinematic equations that describe the rolling motion of that space upon its affine tangent space at a point. This generalizes what can be observed in all the particular cases that are known to the authors. Some of these cases illustrate the general results.
Second-order Optimization over the Multivariate Gaussian Distribution Giovanni Pistone, Luigi Malagò GSI2015
We discuss the optimization of the stochastic relaxation of a real-valued function, i.e., we introduce a new search space given by a statistical model and we optimize the expected value of the original function with respect to a distribution in the model. From the point of view of Information Geometry, statistical models are Riemannian manifolds of distributions endowed with the Fisher information metric, thus the stochastic relaxation can be seen as a continuous optimization problem defined over a differentiable manifold. In this paper we explore the second-order geometry of the exponential family, with applications to the multivariate Gaussian distributions, to generalize second-order optimization methods. Besides the Riemannian Hessian, we introduce the exponential and the mixture Hessians, which come from the dually flat structure of an exponential family. This allows us to obtain different Taylor formulæ according to the choice of the Hessian and of the geodesic used, and thus different approaches to the design of second-order methods, such as the Newton method.
Invariant geometric structures on statistical models Hong Van Le, Jürgen Jost, Lorenz Schwachhöfer, Nihat Ay GSI2015
We review the notion of parametrized measure models and tensor fields on them, which encompasses all statistical models considered by Chentsov [6], Amari [3] and Pistone-Sempi [10]. We give a complete description of n-tensor fields that are invariant under sufficient statistics. In the cases n = 2 and n = 3, the only such tensors are the Fisher metric and the Amari-Chentsov tensor. While this has been shown by Chentsov [7] and Campbell [5] in the case of finite measure spaces, our approach allows us to generalize these results to the cases of infinite sample spaces and arbitrary n. Furthermore, we give a generalisation of the monotonicity theorem and discuss its consequences.
Generalized Mutual-Information based independence tests Amor Keziou, Philippe Regnault GSI2015
We derive independence tests by means of thresholding of dependence measures in a semiparametric context. Precisely, estimates of the mutual information associated to ϕ-divergences are derived through the dual representations of ϕ-divergences. The asymptotic properties of the estimates are established, including consistency, asymptotic distribution and a large deviations principle. The related tests of independence are compared through their relative asymptotic Bahadur efficiency and numerical simulations.
Asymmetric Topologies on Statistical Manifolds Roman Belavkin GSI2015
Asymmetric information distances are used to define asymmetric norms and quasimetrics on the statistical manifold and its dual space of random variables. Quasimetric topology, generated by the Kullback-Leibler (KL) divergence, is considered as the main example, and some of its topological properties are investigated.
Matrix realization of a homogeneous cone Hideyuki Ishi GSI2015
Based on the theory of compact normal left-symmetric algebra (clan), we realize every homogeneous cone as a set of positive definite real symmetric matrices, where homogeneous Hessian metrics as well as a transitive group action on the cone are described efficiently.
Group Theoretical Study on Geodesics for the Elliptical Models Hiroto Inoue GSI2015
We consider the geodesic equation on the elliptical model, which is a generalization of the normal model. More precisely, we characterize this manifold from the group-theoretical viewpoint, formulate Eriksen's procedure for obtaining geodesics on the normal model, and give an alternative proof of it.
Enlargement, geodesics, and collectives Eric Justh, P. S. Krishnaprasad GSI2015
We investigate optimal control of systems of particles on matrix Lie groups coupled through graphs of interaction, and characterize the limit of strong coupling. Following Brockett, we use an enlargement approach to obtain a convenient form of the optimal controls. In the setting of drift-free particle dynamics, the coupling terms in the cost functionals lead to a novel class of problems in the sub-Riemannian geometry of product Lie groups.
The information geometry of mirror descent Garvesh Raskutti, Sayan Mukherjee GSI2015
We prove the equivalence of two online learning algorithms, mirror descent and natural gradient descent. Both mirror descent and natural gradient descent are generalizations of online gradient descent when the parameter of interest lies on a non-Euclidean manifold. Natural gradient descent selects the steepest descent direction along a Riemannian manifold by multiplying the standard gradient by the inverse of the metric tensor. Mirror descent induces non-Euclidean structure by solving iterative optimization problems using different proximity functions. In this paper, we prove that mirror descent induced by a Bregman divergence proximity function is equivalent to the natural gradient descent algorithm on the Riemannian manifold in the dual coordinate system. We use techniques from convex analysis and connections between Riemannian manifolds, Bregman divergences and convexity to prove this result. This equivalence between natural gradient descent and mirror descent implies that (1) mirror descent is the steepest descent direction along the Riemannian manifold corresponding to the choice of Bregman divergence and (2) mirror descent with log-likelihood loss applied to parameter estimation in exponential families asymptotically achieves the classical Cramér-Rao lower bound.
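A concrete instance of this equivalence: on the probability simplex, mirror descent with the negative-entropy Bregman divergence is the exponentiated-gradient update, which matches a natural-gradient step under the Fisher metric of the categorical family. The sketch below illustrates only the mirror-descent side on an assumed toy objective (minimizing a KL divergence to a fixed target); it is not code from the paper.

```python
import numpy as np

def mirror_descent_simplex(grad, p0, step=0.1, n_iter=200):
    """Mirror descent with the negative-entropy mirror map on the probability simplex.

    The update p <- p * exp(-step * grad(p)) / Z is the exponentiated-gradient rule; for
    the categorical family it corresponds to a natural-gradient step in dual coordinates."""
    p = p0.copy()
    for _ in range(n_iter):
        p = p * np.exp(-step * grad(p))
        p /= p.sum()
    return p

# toy objective: minimize KL(p || q) over the simplex; its gradient is log(p/q) + 1
q = np.array([0.5, 0.3, 0.15, 0.05])
grad = lambda p: np.log(p / q) + 1.0
p0 = np.full(4, 0.25)
print(mirror_descent_simplex(grad, p0))   # converges to the target distribution q
```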