Creative Commons Attribution-ShareAlike 4.0 International
This work, Second-order Optimization over the Multivariate Gaussian Distribution, by Luigi Malagò is licensed under a Creative Commons Attribution-ShareAlike 4.0 International license.

Second-order Optimization over the Multivariate Gaussian Distribution


Abstract:
We discuss the optimization of the stochastic relaxation of a real-valued function, i.e., we introduce a new search space given by a statistical model and we optimize the expected value of the original function with respect to a distribution in the model. From the point of view of Information Geometry, statistical models are Riemannian manifolds of distributions endowed with the Fisher information metric, thus the stochastic relaxation can be seen as a continuous optimization problem defined over a differentiable manifold. In this paper we explore the second-order geometry of the exponential family, with applications to the multivariate Gaussian distributions, to generalize second-order optimization methods. Besides the Riemannian Hessian, we introduce the exponential and the mixture Hessians, which come from the dually flat structure of an exponential family. This allows us to obtain different Taylor formulæ according to the choice of the Hessian and of the geodesic used, and thus different approaches to the design of second-order methods, such as the Newton method.
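To make the notion of stochastic relaxation concrete, the following is a minimal sketch, not the method developed in the paper: it relaxes a simple quadratic objective through a Gaussian search distribution with fixed isotropic covariance and updates the mean with a plain Monte Carlo Newton-type step, using the score-function identities for the gradient and Hessian of the expected value. All function names (f, newton_step) and numerical parameters are illustrative assumptions.

```python
# A minimal, illustrative sketch (not the paper's algorithm): stochastic
# relaxation of a deterministic objective f through a Gaussian search
# distribution N(mu, sigma^2 I) with fixed covariance. The gradient and
# Hessian of the relaxation F(mu) = E[f(X)] are estimated by Monte Carlo
# score-function identities and used in a safeguarded Newton-type step on
# the mean. Names and parameter values are assumptions made for illustration.

import numpy as np

def f(x):
    # Deterministic objective to be relaxed: a simple quadratic (sphere) function.
    return np.sum(x ** 2, axis=-1)

def newton_step(mu, sigma, rng, n_samples=4000, min_curvature=1.0):
    d = mu.shape[0]
    # Sample from the current search distribution N(mu, sigma^2 I).
    x = mu + sigma * rng.standard_normal((n_samples, d))
    fx = f(x)
    score = (x - mu) / sigma ** 2                  # s = Sigma^{-1} (x - mu)
    # Score-function identities for the relaxation F(mu) = E[f(X)]:
    #   grad F = E[ f(X) s ],   Hess F = E[ f(X) (s s^T - Sigma^{-1}) ].
    # Centering f by its sample mean leaves both estimators essentially
    # unbiased (E[s] = 0 and E[s s^T] = Sigma^{-1}) while reducing variance.
    centered = fx - fx.mean()
    grad = (centered[:, None] * score).mean(axis=0)
    hess = np.einsum('n,ni,nj->ij', centered, score, score) / n_samples
    # Safeguarded Newton step: the Monte Carlo Hessian is noisy and may be
    # indefinite, so its eigenvalues are floored before inverting.
    hess = 0.5 * (hess + hess.T)
    evals, evecs = np.linalg.eigh(hess)
    evals = np.maximum(evals, min_curvature)
    return mu - evecs @ ((evecs.T @ grad) / evals)

rng = np.random.default_rng(0)
mu, sigma = np.full(5, 2.0), 1.0
for _ in range(20):
    mu = newton_step(mu, sigma, rng)
print("final mean:", mu, " f(mu) =", f(mu))
```

The eigenvalue floor is only a safeguard against noise in the Monte Carlo Hessian estimate; the paper itself is concerned with which Hessian (Riemannian, exponential, or mixture) and which geodesic to use when the search space is the full Gaussian family equipped with the Fisher information metric.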