From Almost Gaussian to Gaussian

21/09/2014
Authors: Max H. M. Costa, Chandra Nair, Olivier Rioul
Publication MaxEnt 2014
OAI : oai:www.see.asso.fr:9603:11317
DOI: 10.23723/9603/11317

Abstract



License

Creative Commons: None (All rights reserved)

<resource  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                xmlns="http://datacite.org/schema/kernel-4"
                xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4/metadata.xsd">
        <identifier identifierType="DOI">10.23723/9603/11317</identifier><creators><creator><creatorName>Chandra Nair</creatorName></creator><creator><creatorName>Olivier Rioul</creatorName></creator><creator><creatorName>Max H. M. Costa</creatorName></creator></creators><titles>
            <title>From Almost Gaussian to Gaussian</title></titles>
        <publisher>SEE</publisher>
        <publicationYear>2014</publicationYear>
        <resourceType resourceTypeGeneral="Text">Text</resourceType><dates>
	    <date dateType="Created">Sat 30 Aug 2014</date>
	    <date dateType="Updated">Mon 2 Oct 2017</date>
            <date dateType="Submitted">Tue 22 May 2018</date>
	</dates>
        <alternateIdentifiers>
	    <alternateIdentifier alternateIdentifierType="bitstream">4184cfd4cf86ac99f8581a4d3543ecc6fc753bfc</alternateIdentifier>
	</alternateIdentifiers>
        <formats>
	    <format>application/pdf</format>
	</formats>
	<version>34204</version>
        <descriptions>
            <description descriptionType="Abstract"></description>
        </descriptions>
    </resource>

From Almost Gaussian to Gaussian

Max H. M. Costa*, Chandra Nair∆, Olivier Rioul‡
*University of Campinas – Unicamp, max@fee.unicamp.br
∆The Chinese University of Hong Kong, chandra.nair@gmail.com
‡Télécom ParisTech, olivier.rioul@telecom-paristech.fr

Abstract

We consider lower and upper bounds on the difference between the differential entropies of a Gaussian random vector and an approximately Gaussian random vector after each is "smoothed" by an arbitrarily distributed random vector of finite power. These bounds are important for establishing the optimality of the corner points of the capacity region of Gaussian interference channels. A problematic issue in a previous attempt to establish these bounds was detected in 2004, and the corner points in question have since been dubbed "the missing corner points". The importance of the given bounds comes from the fact that they induce Fano-type inequalities for the Gaussian interference channel. Whereas the usual Fano inequalities are based on a communication requirement, the new inequalities are derived from a non-disturbance constraint. The upper bound on the difference of differential entropies is established by the Data Processing Inequality (DPI); for the lower bound, we argue that it follows from the DPI together with a continuity argument.
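The quantity being bounded can be written out explicitly. The following is a sketch in our own notation (the symbols X*, X, and Z are assumptions inferred from the abstract, not taken from the paper):

```latex
% Notation (ours, inferred from the abstract):
%   X^* : a Gaussian random vector;
%   X   : an approximately Gaussian random vector, close to X^* in distribution;
%   Z   : an arbitrarily distributed "smoothing" random vector of finite power,
%         \mathbb{E}\|Z\|^2 < \infty, independent of X and of X^*.
%
% The abstract bounds, from above and below, the difference of differential
% entropies after smoothing:
\[
  \Delta \;=\; h(X^* + Z) \;-\; h(X + Z).
\]
% Per the abstract: the upper bound on \Delta is established via the Data
% Processing Inequality (DPI), while the lower bound combines the DPI with a
% continuity argument (heuristically, as X approaches X^* in distribution,
% both entropies coincide and \Delta vanishes).
```

These bounds play the role of Fano-type inequalities for the Gaussian interference channel, derived from a non-disturbance constraint rather than the usual communication requirement.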