Statistical Manifolds Admitting Torsion, Pre-contrast Functions and Estimating Functions

07/11/2017
Author: Masayuki Henmi
Publication: GSI2017
DOI: 10.23723/17410/22638

Statistical Manifolds Admitting Torsion, Pre-contrast Functions and Estimating Functions

Masayuki Henmi ⋆
The Institute of Statistical Mathematics, 10-3 Midori-cho, Tachikawa, Tokyo 190-8562, Japan
henmi@ism.ac.jp

⋆ This work was supported by JSPS KAKENHI Grant Number 15K00064.

Abstract. It is well-known that a contrast function defined on a product manifold M × M induces a Riemannian metric and a pair of dual torsion-free affine connections on the manifold M. This geometrical structure is called a statistical manifold and plays a central role in information geometry. Recently, the notion of pre-contrast function has been introduced and shown to induce a similar differential geometrical structure on M, but one of the two dual affine connections is not necessarily torsion-free. This structure is called a statistical manifold admitting torsion. This paper summarizes such previous results, including the fact that an estimating function on a parametric statistical model naturally defines a pre-contrast function which induces a statistical manifold admitting torsion, and provides some new insights into this geometrical structure. That is, we show that the canonical pre-contrast function can be defined on a partially flat space, which is a flat manifold with respect to only one of the dual connections, and discuss a generalized projection theorem in terms of the canonical pre-contrast function.

Keywords: Statistical manifold, torsion, contrast function, estimating function, standardization, Godambe information matrix, information geometry

1 Introduction

In information geometry, a central role is played by a statistical manifold, which is a Riemannian manifold with a pair of dual torsion-free affine connections. This geometrical structure is induced from an asymmetric (squared) distance-like smooth function called a contrast function by taking its second and third derivatives [1], [2]. The Kullback-Leibler divergence on a regular parametric statistical model is a typical example of a contrast function, and the geometrical objects it induces are the Fisher metric and the exponential and mixture connections. The structure determined by these objects plays an important role in the geometry of statistical inference, as is widely known [3], [4].

A statistical manifold admitting torsion (SMAT) is a Riemannian manifold with a pair of dual affine connections, where one of them must be torsion-free but the other is not necessarily so. This geometrical structure naturally appears in a quantum statistical model (i.e. a set of density matrices representing quantum states) [3], and the notion of SMAT was originally introduced to study such a structure from a mathematical point of view [5]. A pre-contrast function was subsequently introduced as a generalization of the first derivative of a contrast function, and it was shown that a pre-contrast function induces a SMAT by taking its first and second derivatives [6]. Henmi and Matsuzoe [7] showed that a SMAT also appears in “classical” statistics through an estimating function. More precisely, an estimating function naturally defines a pre-contrast function on a parametric statistical model, and a SMAT is induced from it. This paper summarizes such previous results and provides some new insights into this geometrical structure.
That is, we show that the canonical pre-contrast function can be defined on a partially flat space, which is a SMAT where only one of the dual connections is flat, and we discuss a generalized projection theorem in a partially flat space. This theorem relates the orthogonality of ∇*-geodesics (the geodesics of the flat connection) to the canonical pre-contrast function.

2 Statistical Manifolds and Contrast Functions

In this paper we assume that all geometrical objects on differentiable manifolds are smooth, and we restrict our attention to Riemannian manifolds, although most of the concepts can be defined for semi-Riemannian manifolds.

Let (M, g) be a Riemannian manifold and ∇ an affine connection on M. The dual connection ∇* of ∇ with respect to g is defined by

  X g(Y, Z) = g(∇_X Y, Z) + g(Y, ∇*_X Z)   (∀X, ∀Y, ∀Z ∈ X(M)),

where X(M) is the set of all vector fields on M. For an affine connection ∇ on M, its curvature tensor field R and torsion tensor field T are defined as usual by

  R(X, Y)Z := ∇_X ∇_Y Z − ∇_Y ∇_X Z − ∇_[X,Y] Z   (∀X, ∀Y, ∀Z ∈ X(M)),
  T(X, Y) := ∇_X Y − ∇_Y X − [X, Y]   (∀X, ∀Y ∈ X(M)).

An affine connection ∇ is said to be torsion-free if T = 0. Note that for a torsion-free affine connection ∇, ∇* = ∇ implies that ∇ is the Levi-Civita connection with respect to g. Let R* and T* be the curvature and torsion tensor fields of ∇*, respectively. It is easy to see that R = 0 always implies R* = 0, but T = 0 does not necessarily imply T* = 0.

Let ∇ be a torsion-free affine connection on a Riemannian manifold (M, g). Following [8], we say that (M, g, ∇) is a statistical manifold if and only if ∇g is a symmetric (0, 3)-tensor field, that is,

  (∇_X g)(Y, Z) = (∇_Y g)(X, Z)   (∀X, ∀Y, ∀Z ∈ X(M)).   (1)

This condition is equivalent to T* = 0 under the condition that ∇ is torsion-free. If (M, g, ∇) is a statistical manifold, so is (M, g, ∇*), and it is called the dual statistical manifold of (M, g, ∇). Since ∇ and ∇* are both torsion-free for a statistical manifold (M, g, ∇), R = 0 implies that ∇ and ∇* are both flat. In this case, (M, g, ∇, ∇*) is called a dually flat space.

Let ϕ be a real-valued function on the direct product M × M of a manifold M with itself, and let X_1, ..., X_i, Y_1, ..., Y_j be vector fields on M. The functions ϕ[X_1, ..., X_i | Y_1, ..., Y_j], ϕ[X_1, ..., X_i | ] and ϕ[ | Y_1, ..., Y_j] on M are defined by

  ϕ[X_1, ..., X_i | Y_1, ..., Y_j](r) := (X_1)_p ··· (X_i)_p (Y_1)_q ··· (Y_j)_q ϕ(p, q) |_{p=r, q=r},   (2)
  ϕ[X_1, ..., X_i | ](r) := (X_1)_p ··· (X_i)_p ϕ(p, r) |_{p=r},   (3)
  ϕ[ | Y_1, ..., Y_j](r) := (Y_1)_q ··· (Y_j)_q ϕ(r, q) |_{q=r}   (4)

for any r ∈ M, respectively [1]. Using these notations, a contrast function ϕ is defined to be a real-valued function which satisfies the following conditions on M [1], [2]:

  (a) ϕ(p, p) = 0 (∀p ∈ M),
  (b) ϕ[X | ] = ϕ[ | X] = 0 (∀X ∈ X(M)),
  (c) g(X, Y) := −ϕ[X | Y] (∀X, ∀Y ∈ X(M)) is a Riemannian metric on M.

Note that these conditions imply that in some neighborhood of the diagonal set {(r, r) | r ∈ M} in M × M,

  ϕ(p, q) ≥ 0,   ϕ(p, q) = 0 ⟺ p = q.

Although a contrast function is not necessarily symmetric, this inequality means that a contrast function measures some discrepancy between two points on M (at least locally). For a given contrast function ϕ, two affine connections ∇ and ∇* are defined by

  g(∇_X Y, Z) = −ϕ[XY | Z],   g(Y, ∇*_X Z) = −ϕ[Y | XZ]   (∀X, ∀Y, ∀Z ∈ X(M)).

In this case, ∇ and ∇* are both torsion-free and dual to each other with respect to g, which means that both (M, g, ∇) and (M, g, ∇*) are statistical manifolds.
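To see these derivative formulas in action, the following sympy sketch (not part of the original paper) takes the toy contrast function ϕ(p, q) = ½ (p − q)^T A (p − q) on R² with a symmetric positive-definite matrix A, and computes g_ij = −ϕ[∂_i | ∂_j], Γ_ij,k = −ϕ[∂_i ∂_j | ∂_k] and Γ*_ik,j = −ϕ[∂_j | ∂_i ∂_k] directly from the definitions (2)-(4); the induced metric is A and both connections vanish, as expected for this symmetric quadratic example.

```python
# Illustrative sketch only (not from the paper): the structure induced by the toy
# contrast function phi(p, q) = 1/2 (p - q)^T A (p - q) on R^2, computed via (2)-(4).
import sympy as sp

p1, p2, q1, q2, r1, r2 = sp.symbols('p1 p2 q1 q2 r1 r2', real=True)
p, q = [p1, p2], [q1, q2]
A = sp.Matrix([[2, 1], [1, 3]])                       # symmetric positive-definite

d = sp.Matrix([p1 - q1, p2 - q2])
phi = sp.Rational(1, 2) * (d.T * A * d)[0, 0]
diag = {p1: r1, p2: r2, q1: r1, q2: r2}               # evaluate on the diagonal p = q = r

# g_ij(r) = -phi[d_i | d_j](r): differentiate by p_i in the first slot, q_j in the second
g = sp.Matrix(2, 2, lambda i, j: -sp.diff(phi, p[i], q[j]).subs(diag))
print(g)                                              # recovers A: the induced metric

# Gamma_{ij,k} = -phi[d_i d_j | d_k] and Gamma*_{ik,j} = -phi[d_j | d_i d_k]
Gamma  = [[[-sp.diff(phi, p[i], p[j], q[k]).subs(diag) for k in range(2)]
           for j in range(2)] for i in range(2)]
GammaD = [[[-sp.diff(phi, p[j], q[i], q[k]).subs(diag) for k in range(2)]
           for j in range(2)] for i in range(2)]
print(Gamma)                                          # all zero: nabla is flat here
print(GammaD)                                         # all zero: nabla* is flat here
```

An asymmetric contrast function (such as the Kullback-Leibler divergence below) generally produces two different connections; the quadratic example is used only because its derivatives are easy to check by hand.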
In particular, (M, g, ∇) is called the statistical manifold induced from the contrast function ϕ.

Now we briefly mention a typical example of contrast functions. Let S = {p(x; θ) | θ = (θ^1, ..., θ^d) ∈ Θ ⊂ R^d} be a regular parametric statistical model, which is a set of probability density functions with respect to a dominating measure ν on a sample space X. Each element is indexed by a parameter (vector) θ in an open subset Θ of R^d, and the set S satisfies some regularity conditions, under which S can be seen as a differentiable manifold. The Kullback-Leibler divergence of two density functions p_1(x) = p(x; θ_1) and p_2(x) = p(x; θ_2) in S is defined to be

  ϕ_KL(p_1, p_2) := ∫_X p_2(x) log{ p_2(x) / p_1(x) } ν(dx).

It is easy to see that the Kullback-Leibler divergence satisfies the conditions (a), (b) and (c), and so it is a contrast function on S. The induced Riemannian metric and dual connections are the Fisher metric and the exponential and mixture connections, respectively; they are given as follows:

  g_jk(θ) := g(∂_j, ∂_k) = E_θ{ s_j(x, θ) s_k(x, θ) },
  Γ_ij,k(θ) := g(∇_{∂_i} ∂_j, ∂_k) = E_θ[ {∂_i s_j(x, θ)} s_k(x, θ) ],
  Γ*_ik,j(θ) := g(∂_j, ∇*_{∂_i} ∂_k) = ∫_X s_j(x, θ) ∂_i ∂_k p(x; θ) ν(dx),

where E_θ indicates that the expectation is taken with respect to p(x; θ), ∂_i = ∂/∂θ^i and s_i(x, θ) = ∂_i log p(x; θ) (i = 1, ..., d). As is widely known, this geometrical structure plays the most fundamental and important role in the differential geometry of statistical inference [3], [4].

3 Statistical Manifolds Admitting Torsion and Pre-contrast Functions

A statistical manifold admitting torsion is an abstract notion for the geometrical structure in which only one of the dual connections is allowed to have torsion; it naturally appears in a quantum statistical model [3]. The definition is obtained by generalizing (1) in the definition of a statistical manifold as follows [5]. Let (M, g) be a Riemannian manifold and ∇ an affine connection on M. We say that (M, g, ∇) is a statistical manifold admitting torsion (SMAT for short) if and only if

  (∇_X g)(Y, Z) − (∇_Y g)(X, Z) = −g(T(X, Y), Z)   (∀X, ∀Y, ∀Z ∈ X(M)).   (5)

This condition is equivalent to T* = 0 in the case where ∇ possibly has torsion. Note that the condition (5) reduces to (1) if ∇ is torsion-free, and that (M, g, ∇*) is not necessarily a statistical manifold although ∇* is torsion-free. It should also be noted that (M, g, ∇*) is a SMAT whenever a torsion-free affine connection ∇ is given on a Riemannian manifold (M, g). For a SMAT (M, g, ∇), R = 0 does not necessarily imply that ∇ is flat, but it does imply that ∇* is flat, since R* = 0 and T* = 0. In this case, we call (M, g, ∇, ∇*) a partially flat space.

Let ρ be a real-valued function on the direct product TM × M of the tangent bundle TM of a manifold M and M itself, and let X_1, ..., X_i, Y_1, ..., Y_j, Z be vector fields on M. The function ρ[X_1, ..., X_i Z | Y_1, ..., Y_j] is defined by

  ρ[X_1, ..., X_i Z | Y_1, ..., Y_j](r) := (X_1)_p ··· (X_i)_p (Y_1)_q ··· (Y_j)_q ρ(Z_p, q) |_{p=r, q=r}

for any r ∈ M. Note that the role of Z is different from that of the other vector fields in the notation of (2). The functions ρ[X_1, ..., X_i Z | ] and ρ[ | Y_1, ..., Y_j] are also defined in a similar way to (3) and (4).

We say that ρ is a pre-contrast function on M if and only if the following conditions are satisfied [6], [7]:

  (a) ρ(f_1 X_1 + f_2 X_2, q) = f_1 ρ(X_1, q) + f_2 ρ(X_2, q) (∀f_i ∈ C∞(M), ∀X_i ∈ X(M) (i = 1, 2), ∀q ∈ M),
  (b) ρ[X | ] = 0 (∀X ∈ X(M)), i.e. ρ(X_p, p) = 0 (∀p ∈ M),
  (c) g(X, Y) := −ρ[X | Y] (∀X, ∀Y ∈ X(M)) is a Riemannian metric on M.
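As a concrete check of the Kullback-Leibler example above, the following sympy sketch (again not part of the paper; the Bernoulli model p(x; θ) = θ^x (1 − θ)^{1−x} is assumed purely for illustration) verifies the contrast-function conditions (a) and (b) of Section 2 for ϕ_KL and confirms that −ϕ_KL[∂ | ∂] equals the Fisher information E_θ{s(x, θ)²} = 1/(θ(1 − θ)).

```python
# Illustrative sketch only (not from the paper): phi_KL on the Bernoulli model.
import sympy as sp

x = sp.symbols('x')                                    # x takes the values 0 and 1
t1, t2, t = sp.symbols('theta1 theta2 theta', positive=True)

def pmf(theta):
    return theta**x * (1 - theta)**(1 - x)             # Bernoulli(theta) pmf

# phi_KL(p1, p2) = sum_x p(x; theta2) log( p(x; theta2) / p(x; theta1) )
phi = sum((pmf(t2) * sp.log(pmf(t2) / pmf(t1))).subs(x, k) for k in (0, 1))
diag = {t1: t, t2: t}

print(sp.simplify(phi.subs(diag)))                     # condition (a): 0 on the diagonal
print(sp.simplify(sp.diff(phi, t1).subs(diag)))        # condition (b): phi[d| ] = 0
print(sp.simplify(sp.diff(phi, t2).subs(diag)))        # condition (b): phi[ |d] = 0

# condition (c): g = -phi[d|d] equals the Fisher information E[s^2] = 1/(theta(1-theta))
g = sp.simplify(-sp.diff(phi, t1, t2).subs(diag))
score = sp.diff(sp.log(pmf(t)), t)
fisher = sp.simplify(sum((score**2 * pmf(t)).subs(x, k) for k in (0, 1)))
print(g, fisher, sp.simplify(g - fisher))              # the difference is 0
```

In the same way, −ϕ_KL[∂_i ∂_j | ∂_k] and −ϕ_KL[∂_j | ∂_i ∂_k] reproduce the exponential and mixture connection coefficients displayed above.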
Note that for any contrast function ϕ, the function ρ_ϕ defined by

  ρ_ϕ(Z_p, q) := Z_p ϕ(p, q)   (∀p, ∀q ∈ M, ∀Z_p ∈ T_p(M))

is a pre-contrast function on M. The notion of pre-contrast function is obtained by taking the fundamental properties of the first derivative of a contrast function as axioms. For a given pre-contrast function ρ, two affine connections ∇ and ∇* are defined by the following equations, in the same way as for a contrast function:

  g(∇_X Y, Z) = −ρ[XY | Z],   g(Y, ∇*_X Z) = −ρ[Y | XZ]   (∀X, ∀Y, ∀Z ∈ X(M)).

In this case, ∇ and ∇* are dual to each other with respect to g and ∇* is torsion-free. However, the affine connection ∇ possibly has torsion. This means that (M, g, ∇) is a SMAT, and it is called the SMAT induced from the pre-contrast function ρ.

4 Generalized Projection Theorem in Partially Flat Spaces

In a dually flat space (M, g, ∇, ∇*), it is well-known that the canonical contrast functions (called the ∇- and ∇*-divergences) are naturally defined, and the Pythagorean theorem and the projection theorem are stated in terms of the ∇- and ∇*-geodesics and the canonical contrast functions [3], [4]. In a partially flat space (M, g, ∇, ∇*), where R = R* = 0 and T* = 0, a pre-contrast function which seems to be canonical can be defined, and a projection theorem holds in terms of this “canonical” pre-contrast function and the ∇*-geodesics.

Proposition 1 (Canonical Pre-contrast Functions). Let (M, g, ∇, ∇*) be a partially flat space (i.e. (M, g, ∇) is a SMAT with R = R* = 0 and T* = 0) and let (U, η_i) be an affine coordinate neighborhood with respect to ∇* in M. The function ρ on TU × U defined by the following equation is a pre-contrast function on U which induces the SMAT (U, g, ∇):

  ρ(Z_p, q) := −g_p(Z_p, γ̇*(0))   (∀p, ∀q ∈ U, ∀Z_p ∈ T_p(U)),   (6)

where γ* : [0, 1] → U is the ∇*-geodesic such that γ*(0) = p, γ*(1) = q, and γ̇*(0) is the tangent vector of γ* at p.

Proof. For the function ρ defined by (6), the condition (a) in the definition of pre-contrast functions follows from the bilinearity of the inner product g_p. The condition (b) immediately follows from γ̇*(0) = 0 when p = q. By calculating the derivatives of ρ with the affine coordinate system (η_i), it can be shown that the condition (c) holds and that the induced Riemannian metric and dual affine connections coincide with the original g, ∇ and ∇*. □

In particular, if (M, g, ∇, ∇*) is a dually flat space, the pre-contrast function ρ defined in (6) coincides with the directional derivative of the ∇*-divergence ϕ*(·, q) with respect to Z_p (cf. [9], [10]). Hence, the definition (6) seems to be a natural one, and we call the function ρ in (6) the canonical pre-contrast function in a partially flat space (U, g, ∇, ∇*).

From the definition of the canonical pre-contrast function, we immediately obtain the following theorem.

Corollary 1 (Generalized Projection Theorem). Let U be an affine coordinate neighborhood as above and let ρ be the canonical pre-contrast function defined in Proposition 1. For any submanifold N in U, the following conditions are equivalent:
  (i) the ∇*-geodesic starting at q ∈ U is perpendicular to N at p ∈ N;
  (ii) ρ(Z_p, q) = 0 for any Z_p in T_p(N).

In the case where (U, g, ∇, ∇*) is a dually flat space, the projection theorem states that the minimum of the ∇*-divergence ϕ*(·, q) : N → R is attained at the point p ∈ N where the ∇*-geodesic starting at q is perpendicular to N. This immediately follows from the generalized projection theorem, since the directional derivative of ϕ*(·, q) is the canonical pre-contrast function.
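A minimal numerical illustration of Proposition 1 and Corollary 1 (not from the paper) is the simplest dually flat, hence partially flat, example: R² with the Euclidean metric and the flat connection, where ∇*-geodesics are straight lines and the canonical pre-contrast function (6) reduces to ρ(Z_p, q) = −⟨Z_p, q − p⟩. The numpy sketch below checks that ρ(Z_p, q) vanishes on the tangent space of a line N exactly at the foot of the perpendicular from q, as the generalized projection theorem asserts.

```python
# Illustrative sketch only (not from the paper): the canonical pre-contrast function
# in the Euclidean plane, where nabla*-geodesics are straight lines.
import numpy as np

def rho(Z_p, p, q):
    # rho(Z_p, q) = -g_p(Z_p, gamma_dot*(0)) with gamma*(t) = p + t (q - p)
    return -np.dot(Z_p, q - p)

# Submanifold N = { (s, 2s) : s in R }, with constant tangent vector Z = (1, 2)
Z = np.array([1.0, 2.0])
q = np.array([3.0, 1.0])
point_on_N = lambda s: np.array([s, 2.0 * s])

# Foot of the perpendicular from q onto N versus an arbitrary other point of N
s_perp = np.dot(q, Z) / np.dot(Z, Z)
p_perp, p_other = point_on_N(s_perp), point_on_N(0.5)

print(rho(Z, p_perp, q))    # 0: the straight geodesic from q meets N orthogonally here
print(rho(Z, p_other, q))   # nonzero at a point where the geodesic is not orthogonal to N
```

In this self-dual Euclidean case the ∇*-divergence is half the squared distance to q, so the corollary reduces to the familiar fact that the squared distance from q to a line is minimized at the orthogonal projection.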
5 Statistical Manifolds Admitting Torsion Induced from Estimating Functions

As we mentioned in the Introduction, a SMAT naturally appears through estimating functions in a “classical” statistical model as well as in a quantum statistical model. In this section, we briefly explain how a SMAT is induced on S from an estimating function. See [7] for more details, including a concrete example.

Let S = {p(x; θ) | θ = (θ^1, ..., θ^d) ∈ Θ ⊂ R^d} be a regular parametric statistical model. An estimating function on S, as we consider it here, is an R^d-valued function u(x, θ) satisfying the following conditions:

  E_θ{u(x, θ)} = 0,   E_θ{‖u(x, θ)‖²} < ∞,   det[ E_θ{ (∂u/∂θ)(x, θ) } ] ≠ 0   (∀θ ∈ Θ).

The first condition is called the unbiasedness of estimating functions, and it is important to ensure the consistency of the estimator obtained from an estimating function. Let X_1, ..., X_n be a random sample from an unknown probability distribution p(x; θ_0) in S. The estimator θ̂ for θ_0, which is obtained as a solution to the estimating equation Σ_{i=1}^n u(X_i, θ) = 0, is called an M-estimator. The M-estimator θ̂ has the consistency θ̂ → θ_0 (in probability as n → ∞) and the asymptotic normality √n (θ̂ − θ_0) → N(0, Avar(θ̂)) (in distribution as n → ∞) under some additional regularity conditions [11], where Avar(θ̂) is the asymptotic variance-covariance matrix of θ̂ and is given by

  Avar(θ̂) = {A(θ_0)}^{-1} B(θ_0) {A(θ_0)}^{-T},   A(θ) := E_θ{ (∂u/∂θ)(x, θ) },   B(θ) := E_θ{ u(x, θ) u(x, θ)^T }.

In order to induce the structure of a SMAT on S from an estimating function, we consider the notion of standardization of estimating functions. For an estimating function u(x, θ), its standardization (or standardized estimating function) is defined by

  u*(x, θ) := E_θ{ s(x, θ) u(x, θ)^T } [ E_θ{ u(x, θ) u(x, θ)^T } ]^{-1} u(x, θ),

where s(x, θ) = (∂/∂θ) log p(x; θ) is the score function [12]. Geometrically, the i-th component of the standardized estimating function u*(x, θ) is the orthogonal projection of the i-th component of the score function s(x, θ) onto the linear space spanned by all components of the estimating function u(x, θ) in the Hilbert space H_θ := {a(x) | E_θ{a(x)} = 0, E_θ{a(x)²} < ∞} with the inner product ⟨a(x), b(x)⟩_θ := E_θ{a(x) b(x)} (∀a(x), ∀b(x) ∈ H_θ). In terms of the standardization, the asymptotic variance-covariance matrix can be rewritten as Avar(θ̂) = {G(θ_0)}^{-1}, where G(θ) := E_θ{ u*(x, θ) u*(x, θ)^T }. The matrix G(θ) is called the Godambe information matrix [13], which is a generalization of the Fisher information matrix.

As we have seen in Section 2, the Kullback-Leibler divergence ϕ_KL is a contrast function on S. Hence, the first derivative of ϕ_KL is a pre-contrast function on S, and it is given by

  ρ_KL((∂_j)_{p_1}, p_2) := (∂_j)_{p_1} ϕ_KL(p_1, p_2) = − ∫_X s_j(x, θ_1) p(x; θ_2) ν(dx)

for any two probability distributions p_1(x) = p(x; θ_1) and p_2(x) = p(x; θ_2) in S and j = 1, ..., d. This observation leads to the following proposition.

Proposition 2 (Pre-contrast Functions from Estimating Functions). For an estimating function u(x, θ) on the parametric model S, a pre-contrast function ρ_u : TS × S → R is defined by

  ρ_u((∂_j)_{p_1}, p_2) := − ∫_X u*_j(x, θ_1) p(x; θ_2) ν(dx)

for any two probability distributions p_1(x) = p(x; θ_1) and p_2(x) = p(x; θ_2) in S and j = 1, ..., d, where u*_j(x, θ) is the j-th component of the standardization u*(x, θ) of u(x, θ).

The use of the standardization u*(x, θ) instead of u(x, θ) ensures that the definition of the function ρ_u does not depend on the choice of coordinate system (parameter) of S.
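To make the standardization and the Godambe information matrix concrete, here is a minimal Monte Carlo sketch (not from the paper). It assumes, purely for illustration, an exponential model with mean θ and the unbiased estimating function u(x, θ) = x² − 2θ² (which is not proportional to the score), and checks numerically that G(θ) = E_θ{u*(x, θ)²} coincides with the one-dimensional sandwich form {A(θ)}²/B(θ) and is smaller than the Fisher information.

```python
# Illustrative sketch only (not from the paper): standardization and Godambe information
# for the toy model p(x; theta) = (1/theta) exp(-x/theta) with u(x, theta) = x^2 - 2 theta^2.
import numpy as np

rng = np.random.default_rng(0)
theta, n = 2.0, 1_000_000
x = rng.exponential(scale=theta, size=n)         # sample from the model at the true theta

u = x**2 - 2.0 * theta**2                        # unbiased: E_theta[x^2] = 2 theta^2
s = (x - theta) / theta**2                       # score of the exponential (mean theta) model

# Standardization u*(x, theta) = E[s u] {E[u^2]}^{-1} u(x, theta)  (one-dimensional case)
c = np.mean(s * u) / np.mean(u**2)
u_star = c * u

godambe = np.mean(u_star**2)                     # G(theta), here 4 / (5 theta^2) = 0.2
fisher  = np.mean(s**2)                          # Fisher information, here 1 / theta^2 = 0.25
print(godambe, fisher)

# The sandwich form agrees: G(theta) = A(theta)^2 / B(theta), A = E[du/dtheta], B = E[u^2]
A = -4.0 * theta                                 # du/dtheta = -4 theta does not depend on x
B = np.mean(u**2)
print(A**2 / B)                                  # again about 0.2
```

Since u* is the orthogonal projection of the score onto the span of u, the Godambe information here cannot exceed the Fisher information; equality would hold if u were equivalent to the score. Returning to Proposition 2, the invariance of ρ_u under reparametrization can be seen as follows.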
In fact, for a coordinate transformation (parameter transformation) η = Φ(θ), the estimating function u(x, θ) is changed into v(x, η) = u(x, Φ^{-1}(η)) and we have v*(x, η) = (∂θ/∂η)^T u*(x, θ). The proof of Proposition 2 is straightforward. In particular, the condition (b) in the definition of pre-contrast functions follows from the unbiasedness of the (standardized) estimating function. The Riemannian metric g and the dual connections ∇ and ∇* induced from the pre-contrast function ρ_u are given as follows:

  g_jk(θ) := g(∂_j, ∂_k) = E_θ{ u*_j(x, θ) u*_k(x, θ) } = G(θ)_jk,
  Γ_ij,k(θ) := g(∇_{∂_i} ∂_j, ∂_k) = E_θ[ {∂_i u*_j(x, θ)} s_k(x, θ) ],
  Γ*_ik,j(θ) := g(∂_j, ∇*_{∂_i} ∂_k) = ∫_X u*_j(x, θ) ∂_i ∂_k p(x; θ) ν(dx),

where G(θ)_jk is the (j, k) component of the Godambe information matrix G(θ). Note that ∇* is always torsion-free since Γ*_ik,j = Γ*_ki,j, whereas ∇ is not necessarily torsion-free unless u*(x, θ) is integrable with respect to θ.

Henmi and Matsuzoe [7] discussed the quasi-score function of [14], which is a well-known example of a non-integrable estimating function. They showed that one of the induced affine connections actually has torsion and the other connection is flat, that is, a partially flat space is induced. The pre-contrast function defined from the estimating function coincides with the canonical pre-contrast function, and the generalized projection theorem can be applied. However, its statistical meaning has not been clarified yet. Although the SMAT induced from an estimating function is expected to have something to do with statistical inference based on that estimating function, the clarification of this relationship is left as a future problem.

References

1. Eguchi, S.: Geometry of minimum contrast. Hiroshima Math. J. 22, 631-647 (1992)
2. Matsuzoe, H.: Geometry of contrast functions and conformal geometry. Hiroshima Math. J. 29, 175-191 (1999)
3. Amari, S., Nagaoka, H.: Methods of Information Geometry. Amer. Math. Soc., Providence, and Oxford University Press, Oxford (2000)
4. Amari, S.: Information Geometry and Its Applications. Springer (2016)
5. Kurose, T.: Statistical manifolds admitting torsion. Geometry and Something, Fukuoka University (2007)
6. Matsuzoe, H.: Statistical manifolds admitting torsion and pre-contrast functions. Information Geometry and Its Related Fields, Osaka City University (2010)
7. Henmi, M., Matsuzoe, H.: Geometry of pre-contrast functions and non-conservative estimating functions. AIP Conference Proceedings 1340, 32-41 (2011)
8. Kurose, T.: On the divergences of 1-conformally flat statistical manifolds. Tohoku Math. J. 46, 427-433 (1994)
9. Henmi, M., Kobayashi, R.: Hooke's law in statistical manifolds and divergences. Nagoya Math. J. 159, 1-24 (2000)
10. Ay, N., Amari, S.: A novel approach to canonical divergences within information geometry. Entropy 17, 8111-8129 (2015)
11. van der Vaart, A.W.: Asymptotic Statistics. Cambridge University Press (2000)
12. Heyde, C.C.: Quasi-Likelihood and Its Application. Springer (1997)
13. Godambe, V.P.: An optimum property of regular maximum likelihood estimation. Ann. Math. Statist. 31, 1208-1211 (1960)
14. McCullagh, P., Nelder, J.A.: Generalized Linear Models (2nd ed.). Chapman and Hall (1989)