Generalized EM algorithms for minimum divergence estimation

28/10/2015
Publication GSI2015
OAI : oai:www.see.asso.fr:11784:14344

Abstract

Minimum divergence estimators are derived through the dual form of the divergence in parametric models. These estimators generalize the classical maximum likelihood estimators. Models with unobserved data, such as mixture models, can be estimated with EM algorithms, which are proved to converge to stationary points of the likelihood function under general assumptions. This paper presents an extension of the EM algorithm, based on the minimization of the dual approximation of the divergence between the empirical measure and the model by a proximal-type algorithm. The algorithm converges to the stationary points of the empirical criterion under general conditions pertaining to the divergence and the model. Robustness properties of this algorithm are also presented. We provide another proof of convergence of the EM algorithm in the case of a two-component Gaussian mixture. Simulations on Gaussian and Weibull mixtures are performed to compare the results with the MLE.
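
As background for the abstract, the dual representation on which minimum dual divergence estimators rest (the duality technique of Broniatowski and Keziou) can be sketched as follows; the notation (\(\varphi\), \(\varphi^{\#}\), \(p_\theta\), sample \(y_1,\dots,y_n\)) is generic and chosen here for illustration, not quoted from the paper. For a convex function \(\varphi\) with \(\varphi(1)=0\) and a parametric family of densities \(\{p_\theta,\ \theta\in\Theta\}\), the divergence between \(p_\theta\) and the true density \(p_{\theta_T}\) admits the dual representation

\[
D_\varphi(p_\theta, p_{\theta_T})
 = \sup_{\alpha\in\Theta}\left\{ \int \varphi'\!\left(\frac{p_\theta}{p_\alpha}\right) p_\theta \, dx
 \;-\; \int \varphi^{\#}\!\left(\frac{p_\theta}{p_\alpha}\right) p_{\theta_T} \, dx \right\},
 \qquad \varphi^{\#}(t) := t\,\varphi'(t) - \varphi(t),
\]

the supremum being attained at \(\alpha=\theta_T\). Replacing the integral against \(p_{\theta_T}\) by the empirical measure of a sample \(y_1,\dots,y_n\) yields the dual approximation of the divergence,

\[
\widehat{D}_\varphi(p_\theta)
 = \sup_{\alpha\in\Theta}\left\{ \int \varphi'\!\left(\frac{p_\theta}{p_\alpha}\right) p_\theta \, dx
 \;-\; \frac{1}{n}\sum_{i=1}^{n} \varphi^{\#}\!\left(\frac{p_\theta(y_i)}{p_\alpha(y_i)}\right) \right\},
\]

and the minimum dual divergence estimator is \(\widehat\theta_n = \arg\min_{\theta\in\Theta} \widehat{D}_\varphi(p_\theta)\). For \(\varphi(t) = -\log t + t - 1\) this criterion equals the negative log-likelihood up to a term not depending on \(\theta\), which is the sense in which such estimators generalize the MLE.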

Authors

Michel Broniatowski, Diaa Al Mohamad


Licence

Creative Commons Attribution-ShareAlike 4.0 International

DOI : 10.23723/11784/14344
Publisher : SEE
Publication year : 2015
Format : application/pdf

Generalized EM Algorithms for Minimum Divergence Estimation
Diaa Al Mohamad and Michel Broniatowski
LSTA, Université Pierre et Marie Curie, Paris 6
2nd Conference on Geometric Science of Information (GSI), 30 October 2015

Outline

The Title
Proximal Algorithms
Divergences
EM
Application: Mixture models
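
The outline items "Proximal Algorithms" and "EM" are connected through the proximal-point reading of EM due to Chrétien and Hero; the sketch below combines that reading with the abstract, so the exact penalty, the order of its arguments and the notation used in the paper may differ. With observed data \(y_1,\dots,y_n\), hidden variables \(z\) and conditional densities \(k(z\mid y,\theta)\) of the hidden data given the observations, the classical EM iteration can be written as a proximal-point step on the observed-data log-likelihood \(\ell\):

\[
\theta^{k+1} = \arg\max_{\theta\in\Theta}
 \left\{ \ell(\theta) - \sum_{i=1}^{n}
   \mathrm{KL}\big( k(\cdot\mid y_i,\theta^{k}) \,\big\|\, k(\cdot\mid y_i,\theta) \big) \right\}.
\]

The generalization indicated by the abstract replaces the negative log-likelihood by the dual approximation \(\widehat{D}_\varphi(p_\theta)\) of a divergence and the Kullback-Leibler penalty by a general \(\psi\)-divergence between the conditional densities, giving iterations of the form

\[
\theta^{k+1} = \arg\min_{\theta\in\Theta}
 \left\{ \widehat{D}_\varphi(p_\theta) + \sum_{i=1}^{n}
   D_\psi\big( k(\cdot\mid y_i,\theta), \, k(\cdot\mid y_i,\theta^{k}) \big) \right\},
\]

which, according to the abstract, converges to stationary points of the empirical criterion under general conditions on the divergence and the model.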