Maximum likelihood estimation of Riemannian metrics from Euclidean data

Date: 07/11/2017
Publication: GSI2017
OAI: oai:www.see.asso.fr:17410:22573

Abstract

Euclidean data often exhibit nonlinear behavior, which may be modeled by assuming the data lie near a nonlinear submanifold of the data space. One approach to finding such a manifold is to estimate a Riemannian metric that locally models the given data. Data distributions with respect to this metric will then tend to follow the nonlinear structure of the data. In practice, the learned metric relies on parameters that are hand-tuned for a given task. We propose to estimate such parameters by maximizing the data likelihood under the assumed distribution. This is complicated by two issues: (1) a change of parameters implies a change of measure, so likelihoods obtained with different parameters are not directly comparable; (2) some choices of parameters render the numerical computation of distances and geodesics unstable, so likelihoods cannot be evaluated. As a practical solution, we propose (1) to re-normalize likelihoods with respect to the usual Lebesgue measure of the data space, and (2) to bound the likelihood when its exact value is unattainable. We provide practical algorithms for these ideas and illustrate their use on synthetic data, images of digits and faces, and signals extracted from EEG scalp measurements.
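The re-normalization step can be illustrated with a minimal, hypothetical sketch (not the code accompanying the paper). It assumes a simple diagonal local metric M(x) built from kernel-weighted squared deviations with bandwidth sigma and regularization rho, both illustrative parameters, and uses a leave-one-out kernel density estimate as a stand-in for a density defined with respect to the Riemannian measure. Multiplying such a density by the volume element sqrt(det M(x)) converts it to a density with respect to the Lebesgue measure, which makes scores obtained with different values of sigma comparable.

import numpy as np

# Hypothetical sketch, not the authors' implementation: a diagonal local metric
# and a crude bandwidth scan whose scores are re-normalized to the Lebesgue
# measure via the Riemannian volume element sqrt(det M(x)).

def local_diagonal_metric(X, x, sigma, rho=1e-3):
    """Diagonal of M(x): reciprocal of kernel-weighted squared deviations plus rho."""
    w = np.exp(-0.5 * np.sum((X - x) ** 2, axis=1) / sigma ** 2)   # kernel weights
    m = (w[:, None] * (X - x) ** 2).sum(axis=0) + rho              # local spread per dimension
    return 1.0 / m

def lebesgue_renormalized_score(X, sigma, rho=1e-3):
    """Average log density, expressed with respect to the Lebesgue measure.

    A leave-one-out Gaussian kernel density estimate stands in for a density
    defined with respect to the Riemannian measure; multiplying by
    sqrt(det M(x)) (added in log-space) converts it to the Lebesgue measure.
    """
    N, D = X.shape
    total = 0.0
    for n in range(N):
        x = X[n]
        d2 = np.sum((np.delete(X, n, axis=0) - x) ** 2, axis=1)
        dens = np.mean(np.exp(-0.5 * d2 / sigma ** 2)) / (2 * np.pi * sigma ** 2) ** (D / 2)
        M_diag = local_diagonal_metric(X, x, sigma, rho)
        total += np.log(dens + 1e-300) + 0.5 * np.sum(np.log(M_diag))
    return total / N

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = rng.uniform(0.0, 2.0 * np.pi, 200)
    X = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.normal(size=(200, 2))  # noisy circle
    for sigma in (0.05, 0.1, 0.2, 0.5, 1.0):
        print(f"sigma={sigma:4.2f}  score={lebesgue_renormalized_score(X, sigma):8.3f}")

The scan simply prints one score per candidate sigma; the paper additionally bounds the likelihood when distance and geodesic computations become unstable, which this sketch does not attempt.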

Article details

Authors: Georgios Arvanitidis, Lars Kai Hansen, Søren Hauberg
Paper: Maximum likelihood estimation of Riemannian metrics from Euclidean data, application/pdf
Slides: Maximum likelihood estimation of Riemannian metrics from Euclidean data (slides), application/pdf

License

Creative Commons: None (All rights reserved)

Metadata

DOI: 10.23723/17410/22573
Publisher: SEE
Publication year: 2018
Resource type: Text
Keywords: Statistics on Manifolds, manifold learning, metric learning
Format: application/pdf
Dates: created 8 Mar 2018, updated 8 Mar 2018, submitted 15 Oct 2018