
(2017) GSI2017

A symplectic minimum variational principle for dissipative dynamical systems Abdelbacet Oueslati, An Danh Nguyen, Géry de Saxcé GSI2017
Article details
Protected content. Document accessible under conditions: you must log in or register to access or acquire this document.
- Free access for rights holders

Using the concept of symplectic subdifferential, we propose a modification of the Hamiltonian formalism which can be used for dissipative systems. The formalism is first illustrated through an application to standard inelasticity in small strains. Some hints concerning possible extensions to non-standard plasticity and finite strains are then given. Finally, we also show how the dissipative transition between macrostates can be viewed as an optimal transportation problem.
Prevalence and recoverability of syntactic parameters in sparse distributed memories Alex Mun, Andrew Zhao, Jeong Joon Park, Kevin Yuh, Matilde Marcolli, Ronnel Boettcher, Vibhor Kumar GSI2017

We propose a new method, based on sparse distributed memory, for studying dependence relations between syntactic parameters in the Principles and Parameters model of Syntax. By storing data of syntactic structures of world languages in a Kanerva network and checking the recoverability of corrupted data from the network, we identify two different effects: an overall underlying relation between the prevalence of parameters across languages and their degree of recoverability, and a finer effect that makes some parameters more easily recoverable beyond what their prevalence would indicate. The latter can be seen as an indication of the existence of dependence relations, through which a given parameter can be determined using the remaining uncorrupted data.
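The store-corrupt-recover experiment can be sketched with a minimal Kanerva-style sparse distributed memory in NumPy. This is a toy autoassociative version: the address length, number of hard locations, and activation radius below are illustrative choices, not the settings used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 256, 2000          # address length, number of hard locations
radius = 112              # Hamming activation radius

hard = rng.integers(0, 2, size=(m, n))   # random hard-location addresses
counters = np.zeros((m, n))              # one counter row per hard location

def activate(addr):
    """Boolean mask of hard locations within the Hamming radius of addr."""
    return np.sum(hard != addr, axis=1) <= radius

def store(addr, data):
    """Write bipolar (+1/-1) data into all activated locations."""
    act = activate(addr)
    counters[act] += 2 * data - 1

def recall(addr):
    """Read by summing activated counters and thresholding at zero."""
    act = activate(addr)
    return (counters[act].sum(axis=0) > 0).astype(int)

pattern = rng.integers(0, 2, size=n)
store(pattern, pattern)                  # autoassociative storage

corrupted = pattern.copy()
flip = rng.choice(n, size=20, replace=False)   # corrupt ~8% of the bits
corrupted[flip] = 1 - corrupted[flip]
recovered = recall(corrupted)
```

Recoverability can then be measured as the fraction of bits of `pattern` that `recovered` restores from the corrupted probe.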
Co-occurrence matrix of covariance matrices: a novel coding model for the classification of texture images Ioana Ilea, Lionel Bombrun, Salem Said, Yannick Berthoumieu GSI2017

This paper introduces a novel local model for the classification of covariance matrices: the co-occurrence matrix of covariance matrices. Contrary to state-of-the-art models (BoRW, R-VLAD and RFV), this local model exploits the spatial distribution of the patches. Starting from the generative mixture model of Riemannian Gaussian distributions, we introduce this local model. An experiment on texture image classification is then conducted on the VisTex and Outex_TC000_13 databases to evaluate its potential.
Co-occurrence matrix of covariance matrices: a novel coding model for the classification of texture images (slides) GSI2017
Geometry of the visuo-vestibular information Daniel Bennequin GSI2017
Dirac structures in nonequilibrium thermodynamics François Gay-Balmaz, Hiroaki Yoshimura GSI2017

In this paper, we show that the evolution equations for nonequilibrium thermodynamics can be formulated in terms of Dirac structures on the Pontryagin bundle 𝒫 = T𝒬 ⊕ T*𝒬, where 𝒬 = Q × ℝ denotes the thermodynamic configuration manifold. In particular, we extend the use of Dirac structures from the case of linear nonholonomic constraints to the case of nonlinear nonholonomic constraints. Such a nonlinear constraint comes from the entropy production associated with irreversible processes in nonequilibrium thermodynamics. We also develop the induced Dirac structure on N = T*Q × ℝ and the associated Lagrange-Dirac and Hamilton-Dirac dynamical formulations.
3D insights to some divergences for robust statistics and machine learning Birgit Roensch, Wolfgang Stummer GSI2017

Divergences (distances) which measure the similarity or proximity between two probability distributions have turned out to be very useful for several different tasks in statistics, machine learning, information theory, etc. Some prominent examples are the Kullback-Leibler information, the Csiszar-Ali-Silvey Φ-divergences CASD (for convex functions Φ), the "classical" (i.e., unscaled) Bregman distances, and the more general scaled Bregman distances SBD of [26],[27]. By means of 3D plots we show several properties and pitfalls of the geometries of SBDs, also for non-probability distributions; robustness of corresponding minimum-distance concepts will also be covered. For these investigations, we construct a special SBD subclass which covers both the often used power divergences (of CASD type) as well as their robustness-enhanced extensions with non-convex non-concave Φ.
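The two divergence families compared here can be written down directly. The following sketch uses the generator φ(t) = t log t, for which both the unscaled Bregman distance and the scaled Bregman distance with scaling m = q reduce to the Kullback-Leibler information; it assumes strictly positive vectors and is only an illustration of the definitions, not of the paper's 3D-plot analysis.

```python
import numpy as np

def bregman(phi, dphi, p, q):
    """Ordinary (unscaled) Bregman distance between positive vectors."""
    return np.sum(phi(p) - phi(q) - dphi(q) * (p - q))

def scaled_bregman(phi, dphi, p, q, m):
    """Scaled Bregman distance with scaling vector m (pointwise scaling)."""
    return np.sum(m * (phi(p / m) - phi(q / m) - dphi(q / m) * (p / m - q / m)))

phi  = lambda t: t * np.log(t)        # generator of the KL information
dphi = lambda t: np.log(t) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.25, 0.25, 0.5])

kl = bregman(phi, dphi, p, q)              # equals KL(p||q) for this phi
kl_scaled = scaled_bregman(phi, dphi, p, q, q)   # scaling by q: also KL(p||q)
```

Scaling by q itself illustrates how the SBD family contains the Φ-divergences as the special case m = q.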
3D insights to some divergences for robust statistics and machine learning (figures) GSI2017
Bregman divergences from comparative convexity Frank Nielsen, Richard Nock GSI2017

Comparative convexity is a generalization of ordinary convexity based on abstract means instead of arithmetic means. We define and study the Bregman divergences with respect to comparative convexity. As an example, we consider the convexity induced by quasi-arithmetic means, report explicit formulas, and show that those Bregman divergences are equivalent to conformal ordinary Bregman divergences on monotone embeddings.
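As a concrete illustration of the abstract means involved, the quasi-arithmetic mean M_f(x, y) = f⁻¹((1−w) f(x) + w f(y)) recovers the classical geometric and harmonic means for f = log and f(t) = 1/t. This is a hedged sketch of the means only; the paper's Bregman construction built on them is not reproduced here.

```python
import numpy as np

def quasi_arithmetic_mean(f, finv, x, y, w=0.5):
    """Weighted quasi-arithmetic mean M_f(x, y) = f^{-1}((1-w) f(x) + w f(y))."""
    return finv((1 - w) * f(x) + w * f(y))

# f = log gives the geometric mean: sqrt(4 * 9) = 6
g = quasi_arithmetic_mean(np.log, np.exp, 4.0, 9.0)

# f(t) = 1/t gives the harmonic mean: 2 / (1/2 + 1/6) = 3
h = quasi_arithmetic_mean(lambda t: 1.0 / t, lambda t: 1.0 / t, 2.0, 6.0)
```

Replacing the arithmetic average inside the usual convexity inequality by such an M_f is exactly the "comparative convexity" the abstract refers to.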
Automatic differentiation of non-holonomic fast marching for computing most threatening trajectories under sensors surveillance (slides) GSI2017
The functor of Amari and Riemannian dynamics Ahmed Zeglaoui, Michel Nguiffo Boyom GSI2017

Let E be a smooth vector bundle over a manifold M, and let ℝ denote the trivial bundle M × ℝ ⟶ M. The gauge group of E is denoted by G(E), and its Lie algebra by ℊ(E). A gauge structure in E is a pair (E, ∇), where ∇ is a connection in E. A metric structure in E is a vector bundle homomorphism g∶ E × E ⟶ ℝ. A connection ∇ is called a metric connection in (E, g) if ∇g = 0.
Our purpose is to discuss the question of whether a given connection is a metric connection. We use two approaches to answer this question. The first approach is based on the functor of Amari in E; it yields a numerical invariant Sb(∇). Another approach involves the group of isomorphisms of the set of gauge structures generated by the set of metrics in E. We use this second approach to introduce a new numerical invariant index(∇). We show that both Sb(∇) and index(∇) are characteristic obstructions to ∇ being a metric connection.
Loosely speaking, the following claims are equivalent: (1) The holonomy group of ∇ is an orthogonal subgroup, (2) Sb(∇) = 0, (3) index(∇)=0.
Semi-Discrete Optimal Transport in Patch Space for Enriching Gaussian Textures Arthur Leclaire, Bruno Galerne, Julien Rabin GSI2017

A bilevel texture model is proposed, based on a local transform of a Gaussian random field. The core of this method relies on the optimal transport of a continuous Gaussian distribution towards the discrete exemplar patch distribution. The synthesis then simply consists in a fast post-processing of a Gaussian texture sample, boiling down to an improved nearest-neighbor patch matching, while offering theoretical guarantees on statistical compliancy.
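The "improved nearest-neighbor patch matching" can be hinted at as follows: the semi-discrete optimal transport map from a continuous distribution to a discrete patch set is a biased nearest-neighbor rule (a power/Laguerre-cell assignment). The sketch below is schematic, with made-up dimensions, and omits the weight-optimization step that would compute the true OT weights.

```python
import numpy as np

rng = np.random.default_rng(1)

exemplar = rng.random((8, 4))              # 8 exemplar patches, dimension 4
gaussian = rng.standard_normal((100, 4))   # patches from a Gaussian sample

def nearest_patch(x, patches, weights=None):
    """Assign x to a patch. With per-patch weights this is the semi-discrete
    OT map (a power/Laguerre diagram); weights = 0 reduces to plain NN."""
    w = np.zeros(len(patches)) if weights is None else weights
    cost = np.sum((patches - x) ** 2, axis=1) - w
    return int(np.argmin(cost))

assignments = np.array([nearest_patch(x, exemplar) for x in gaussian])
```

Optimizing the weights so that each exemplar patch receives its prescribed mass is what turns this plain matching into the semi-discrete transport map of the paper.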
Semi-Discrete Optimal Transport in Patch Space for Enriching Gaussian Textures (slides) GSI2017
Information Geometry of Predictor Functions in a Regression Model (slides) GSI2017
Information Geometry of Predictor Functions in a Regression Model Katsuhiro Omae, Shinto Eguchi GSI2017

We discuss an information-geometric framework for a regression model, in which the regression function is accompanied by the predictor function and the conditional density function. We introduce the e-geodesic and m-geodesic on the space of all predictor functions; this pair leads to the Pythagorean identity for a right triangle spanned by the two geodesics. Further, a statistical model combining predictor functions in a nonlinear fashion via a generalized average is discussed, and in particular we observe the flexible property of the log-exp average.
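The log-exp average can be written as a quasi-arithmetic average with generator exp(τt). The sketch below (parameter names assumed) shows the flexibility the abstract mentions: it interpolates between the weighted arithmetic mean (τ → 0) and the maximum (τ → ∞) of the predictor values.

```python
import numpy as np

def log_exp_average(preds, w, tau):
    """Generalized average with f(t) = exp(tau * t):
    (1/tau) * log( sum_k w_k * exp(tau * preds_k) )."""
    preds = np.asarray(preds, dtype=float)
    w = np.asarray(w, dtype=float)
    return np.log(np.dot(w, np.exp(tau * preds))) / tau

preds = [0.2, 0.8]          # two predictor values at some input x
w = [0.5, 0.5]

small = log_exp_average(preds, w, 1e-6)   # close to the arithmetic mean 0.5
large = log_exp_average(preds, w, 50.0)   # close to the maximum 0.8
```

Tuning τ thus moves the combined predictor continuously between averaging and max-pooling behavior.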
A sequential structure of statistical manifolds on deformed exponential family (slides) GSI2017
A sequential structure of statistical manifolds on deformed exponential family Antonio M. Scarfone, Hiroshi Matsuzoe, Tatsuaki Wada GSI2017

Heavy-tailed probability distributions are important objects in anomalous statistical physics. For such probability distributions, expectations do not exist in general; therefore, an escort distribution and an escort expectation have been introduced. In this paper, by generalizing such escort distributions, a sequence of escort distributions is introduced. For a deformed exponential family, we study the fundamental properties of the statistical manifold structures derived from the sequence of escort expectations.
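For a discrete distribution the basic escort transformation is one line; iterating it gives one possible sequence of escort distributions. Note the iteration below is only an illustrative assumption — the paper's sequence is constructed from a deformed exponential family, not by naive iteration.

```python
import numpy as np

def escort(p, q):
    """q-escort distribution: P(x) = p(x)^q / sum_y p(y)^q."""
    pq = p ** q
    return pq / pq.sum()

def escort_sequence(p, q, n):
    """Illustrative sequence obtained by iterating the escort map."""
    seq = [p]
    for _ in range(n):
        seq.append(escort(seq[-1], q))
    return seq

p = np.array([0.7, 0.2, 0.1])
seq = escort_sequence(p, 0.5, 3)   # q < 1 progressively flattens the tails
```

Escort expectations are then ordinary expectations taken under these transformed distributions, which exist even when moments of the original heavy-tailed p do not.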
Some new flexibilizations of Bregman divergences and their asymptotics Anna-Lena Kißlinger, Wolfgang Stummer GSI2017

Ordinary Bregman divergences (distances) OBD are widely used in statistics, machine learning, and information theory (see e.g. [18], [5]; [7], [14], [4], [23], [6], [16], [22], [25], [15]). They can be flexibilized in various different ways. For instance, there are the Scaled Bregman divergences SBD of Stummer [20] and Stummer & Vajda [21], which contain both the OBDs as well as the Csiszar-Ali-Silvey Φ-divergences as special cases. On the other hand, the OBDs are subsumed by the Total Bregman divergences of Liu et al. [12],[13], Vemuri et al. [24] and the more general Conformal Divergences COD of Nock et al. [17]. The latter authors also indicated the possibility of combining the concepts of SBD and COD, under the name "Conformal Scaled Bregman divergences" CSBD. In this paper, we introduce some new divergences between (non-)probability distributions which particularly cover the corresponding OBD, SBD, COD and CSBD (for separable situations) as special cases. Non-convex generators are employed, too. Moreover, for the case of i.i.d. sampling we derive the asymptotics of a useful new-divergence-based test statistic.
Sample-limited Lp Barycentric Subspace Analysis on Constant Curvature Spaces Xavier Pennec GSI2017

Generalizing Principal Component Analysis (PCA) to manifolds is pivotal for many statistical applications on geometric data. We rely in this paper on barycentric subspaces, implicitly defined as the locus of points which are weighted means of k + 1 reference points [8, 9].
Barycentric subspaces can naturally be nested and allow the construction of inductive forward or backward nested subspaces approximating data points. We can also consider the whole hierarchy of embedded barycentric subspaces defined by an ordered series of points in the manifold (a flag of affine spans): optimizing the accumulated unexplained variance (AUV) over all the subspaces actually generalizes PCA to non-Euclidean spaces, a procedure named Barycentric Subspace Analysis (BSA).
In this paper, we first investigate sample-limited inference algorithms where the optimization is limited to the actual data points: this transforms a general optimization into a simple enumeration problem. Second, we propose to robustify the criterion by considering the unexplained p-variance of the residuals instead of the classical 2-variance. This construction is very natural with barycentric subspaces since the affine span is stable under the choice of the value of p. The proposed algorithms are illustrated on examples in constant curvature spaces: optimizing the (accumulated) unexplained p-variance (Lp PBS and BSA) for 0 < p ≤ 1 can identify reference points in clusters of a few points within a large number of random points in spheres and hyperbolic spaces.
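The sample-limited idea in its simplest case (a single reference point on the sphere S², i.e. k = 0) reduces to enumerating the data points and picking the one with the smallest unexplained p-variance. The cluster construction and the value of p below are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(2)

def sphere_dist(x, y):
    """Geodesic distance between unit vectors on the sphere."""
    return np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))

def random_sphere(n):
    v = rng.standard_normal((n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# a tight cluster of 10 points near the north pole plus 40 random points
cluster = random_sphere(10) * 0.05 + np.array([0.0, 0.0, 1.0])
cluster /= np.linalg.norm(cluster, axis=1, keepdims=True)
data = np.vstack([cluster, random_sphere(40)])

def sample_limited_reference(data, p):
    """Enumerate data points as candidate references and return the index
    minimizing the unexplained p-variance sum_i d(x_i, r)^p."""
    costs = [sum(sphere_dist(x, r) ** p for x in data) for r in data]
    return int(np.argmin(costs))

idx = sample_limited_reference(data, 0.5)   # p <= 1 favors cluster points
```

With p = 0.5 the many near-zero residuals inside the cluster dominate the criterion, so the enumeration picks a reference point in (or very near) the cluster, as the abstract describes.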
Automatic differentiation of non-holonomic fast marching for computing most threatening trajectories under sensors surveillance Jean-Marie Mirebeau, Johann Dreo GSI2017

We consider a two-player game, where a first player has to install a surveillance system within an admissible region. The second player needs to enter the monitored area, visit a target region, and then leave the area, while minimizing his overall probability of detection. Both players know the target region, and the second player knows the surveillance installation details. Optimal trajectories for the second player are computed using a recently developed variant of the fast marching algorithm, which takes into account curvature constraints modeling the second player's vehicle maneuverability. The surveillance system optimization leverages a reverse-mode semi-automatic differentiation procedure, estimating the gradient of the value function with respect to the sensor locations in time O(N ln N).
Diffeomorphic random sampling using optimal information transport Klas Modin, Martin Bauer, Sarang Joshi GSI2017

In this article we explore an algorithm for diffeomorphic random sampling of nonuniform probability distributions on Riemannian manifolds. The algorithm is based on optimal information transport (OIT), an analogue of optimal mass transport (OMT). Our framework uses the deep geometric connections between the Fisher-Rao metric on the space of probability densities and the right-invariant information metric on the group of diffeomorphisms. The resulting sampling algorithm is a promising alternative to OMT, in particular as our formulation is semi-explicit, free of the nonlinear Monge-Ampère equation.
Compared to Markov Chain Monte Carlo methods, we expect our algorithm to stand up well when a large number of samples from a low dimensional nonuniform distribution is needed.
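In one dimension the transport-map idea behind such samplers is explicit: push uniform reference samples through a monotone map whose derivative relates the two densities. Here that map is the inverse CDF; on a manifold, the OIT diffeomorphism of the paper plays the same role. The target density below is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(3)

# target: nonuniform density p(x) = 2x on [0, 1] (triangular)
# its CDF is F(x) = x^2, so the transport map from the uniform
# reference density is the inverse CDF: T(u) = sqrt(u)
def transport_map(u):
    return np.sqrt(u)

u = rng.random(100_000)      # samples from the uniform reference
x = transport_map(u)         # pushed-forward samples follow p(x) = 2x
                             # (E[x] = 2/3 and P(x < 0.5) = 0.25 under p)
```

Because the map is deterministic, every reference sample yields an independent target sample — the property that makes such samplers attractive compared to MCMC when many samples are needed.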
Newton's Equation on Diffeomorphisms and Densities Boris Khesin, Gerard Misiołek, Klas Modin GSI2017

We develop a geometric framework for Newton-type equations on the infinite-dimensional configuration space of probability densities. It can be viewed as a second-order analogue of the "Otto calculus" framework for gradient-flow equations. Namely, for an n-dimensional manifold M we derive Newton's equations on the group of diffeomorphisms Diff(M) and the space of smooth probability densities Dens(M), as well as describe the Hamiltonian reduction relating them. For example, the compressible Euler equations are obtained by a Poisson reduction of Newton's equation on Diff(M) with the symmetry group of volume-preserving diffeomorphisms, while the Hamilton-Jacobi equation of fluid mechanics corresponds to potential solutions. We also prove that the Madelung transform between Schrödinger-type and Newton's equations is a symplectomorphism between the corresponding phase spaces T*Dens(M) and PL²(M, ℂ). This improves on the previous symplectic submersion result of von Renesse [1]. Furthermore, we prove that the Madelung transform is a Kähler map provided that the space of densities is equipped with the (prolonged) Fisher-Rao information metric, and describe its dynamical applications. This geometric setting for the Madelung transform sheds light on the relation between the classical Fisher-Rao metric and its quantum counterpart, the Bures metric. In addition to compressible Euler, Hamilton-Jacobi, and linear and nonlinear Schrödinger equations, the framework for Newton equations encapsulates Burgers' inviscid equation, shallow water equations, two-component and µ-Hunter-Saxton equations, the Klein-Gordon equation, and infinite-dimensional Neumann problems.
Nonlocal Inpainting of Manifold-valued Data on Finite Weighted Graphs Daniel Tenbrinck, Ronny Bergmann GSI2017

Recently, there has been a strong ambition to translate models and algorithms from traditional image processing to non-Euclidean domains, e.g., to manifold-valued data. While the task of denoising has been extensively studied in recent years, there have been few attempts to perform image inpainting on manifold-valued data. In this paper we present a nonlocal inpainting method for manifold-valued data given on a finite weighted graph. We introduce a new graph infinity-Laplace operator based on the idea of discrete minimizing Lipschitz extensions, which we use to formulate the inpainting problem as a PDE on the graph.
Furthermore, we derive an explicit numerical solving scheme, which we evaluate on two classes of synthetic manifold-valued images.
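For real-valued data the discrete minimizing Lipschitz extension behind the graph infinity-Laplace operator has a simple fixed point: each unknown node takes the midpoint of its extreme neighbor values. The sketch below is a real-valued stand-in for the paper's manifold-valued setting (where the midpoint would be a geodesic midpoint) on an unweighted path graph.

```python
import numpy as np

def inpaint_inf_laplace(u, known, neighbors, iters=500):
    """Inpaint unknown nodes by iterating the discrete infinity-Laplace
    fixed point u(x) = (max_N u + min_N u) / 2 (Lipschitz extension)."""
    u = u.copy()
    for _ in range(iters):
        for i in range(len(u)):
            if not known[i]:
                vals = u[neighbors[i]]
                u[i] = 0.5 * (vals.max() + vals.min())
    return u

# path graph 0-1-2-3-4 with both endpoints known: the infinity-harmonic
# extension is the linear interpolation between the boundary values
u = np.array([0.0, 0.0, 0.0, 0.0, 4.0])
known = np.array([True, False, False, False, True])
neighbors = [np.array([1]), np.array([0, 2]), np.array([1, 3]),
             np.array([2, 4]), np.array([3])]
result = inpaint_inf_laplace(u, known, neighbors)
```

On a nonlocal graph the neighbor sets would connect similar patches rather than adjacent pixels, which is what makes the inpainting nonlocal.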
Maximum likelihood estimators on manifolds Hatem Hajri, Salem Said, Yannick Berthoumieu GSI2017

The maximum likelihood estimator (MLE) is a well-known estimator in statistics. The popularity of this estimator stems from its asymptotic and universal properties. While the asymptotic properties of MLEs on Euclidean spaces have attracted a lot of interest, their study on manifolds is still insufficient. The present paper aims to give a unified study of the subject. Its contributions are twofold. First, it proposes a framework of asymptotic results for MLEs on manifolds: consistency, asymptotic normality and asymptotic efficiency. Second, it extends popular testing problems to manifolds. Some examples are discussed.
Nonlocal Inpainting of Manifold-valued Data on Finite Weighted Graphs (slides) GSI2017
Sigma Point Kalman Filtering on Matrix Lie Groups Applied to the SLAM Problem David Evan Zlotnik, James Richard Forbes GSI2017

This paper considers sigma point Kalman filtering on matrix Lie groups. Sigma points that are elements of a matrix Lie group are generated using the matrix exponential. Computing the mean and covariance from the sigma points, via weighted averaging and effective use of the matrix natural logarithm respectively, is discussed. The specific details of estimating landmark locations, and the position and attitude of a vehicle relative to the estimated landmark locations, are considered.
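The sigma-point generation step can be sketched on SO(3): take the columns of a Cholesky factor of the (scaled) covariance in the Lie algebra and map them through the matrix exponential around the current mean rotation. This is a minimal NumPy version with illustrative parameters; the paper's full filter also propagates and reweights these points.

```python
import numpy as np

def hat(w):
    """so(3) hat map: rotation vector -> skew-symmetric matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """Matrix exponential on SO(3) via the Rodrigues formula."""
    t = np.linalg.norm(w)
    if t < 1e-12:
        return np.eye(3)
    K = hat(w / t)
    return np.eye(3) + np.sin(t) * K + (1.0 - np.cos(t)) * (K @ K)

def log_so3(R):
    """Matrix logarithm SO(3) -> rotation vector (angle < pi assumed)."""
    t = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if t < 1e-12:
        return np.zeros(3)
    W = (R - R.T) * (t / (2.0 * np.sin(t)))
    return np.array([W[2, 1], W[0, 2], W[1, 0]])

def sigma_points(R_mean, P, kappa=0.0):
    """2n+1 sigma points on SO(3): perturb the mean rotation by the exp of
    +/- columns of the Cholesky factor of (n + kappa) * P."""
    n = 3
    L = np.linalg.cholesky((n + kappa) * P)
    pts = [R_mean]
    for i in range(n):
        pts.append(R_mean @ exp_so3(L[:, i]))
        pts.append(R_mean @ exp_so3(-L[:, i]))
    return pts

R = exp_so3(np.array([0.1, -0.2, 0.3]))   # current mean attitude
P = 0.01 * np.eye(3)                      # attitude covariance in so(3)
pts = sigma_points(R, P)
```

Mapping the sigma points back through the logarithm at the mean (log_so3 of Rᵀ times each point) recovers the ±Cholesky columns, which is how the weighted mean and covariance are computed on the group.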