Reading Notes 1 of Statistics: Moments and Moment Generating Functions (cf. Statistical Inference by George Casella and Roger L. Berger)


Part 1: Moments

Definition 1 For each integer $n$, the nth moment of $X$, $\mu_n^{\prime}$, is

\[\mu_{n}^{\prime} = EX^n.\]

The nth central moment of $X$, $\mu_n$, is 

\[ \mu_n = E(X-\mu)^n,\]

where $\mu=\mu_{1}^{\prime}=EX$.
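
As a quick worked example (a standard computation, not part of the definition): if $X$ is exponential with rate $\lambda$, i.e. with pdf $f(x)=\lambda e^{-\lambda x}$ for $x>0$, then

\[\mu_{n}^{\prime} = EX^n = \int_0^{\infty} x^n \lambda e^{-\lambda x}\, dx = \frac{n!}{\lambda^n},\]

so in particular $\mu = EX = 1/\lambda$.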

 

Definition 2 The variance of a random variable $X$ is $\mathrm{Var}\, X = E(X-EX)^2$.

The standard deviation of $X$ is $\sqrt{\mathrm{Var}\, X}$.
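
Expanding the square shows that the variance is the second central moment and gives the usual computing formula:

\[\mathrm{Var}\, X = E(X-EX)^2 = EX^2 - (EX)^2 = \mu_{2}^{\prime} - \mu^2.\]

For the exponential example above, $\mathrm{Var}\, X = 2/\lambda^2 - 1/\lambda^2 = 1/\lambda^2$.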

 

 

Part 2: Moment Generating Function (mgf)

The mgf can be used to generate moments. In practice, however, it is often easier to compute moments directly than to use the mgf; the main use of the mgf is not to generate moments but to help characterize a distribution.

 

Definition 3 Let $X$ be a random variable with cdf $F_X$. The moment generating function (mgf) of $X$, denoted by $M_X(t)$, is

\[M_{X}(t) = E e^{tX}, \]

provided that the expectation exists for $t$ in some neighborhood of $0$. If the expectation does not exist in a neighborhood of $0$, we say that the moment generating function does not exist.
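
For example, for the exponential distribution with rate $\lambda$ used above,

\[M_{X}(t) = E e^{tX} = \int_0^{\infty} e^{tx}\, \lambda e^{-\lambda x}\, dx = \frac{\lambda}{\lambda - t}, \quad t < \lambda,\]

so the expectation exists on the neighborhood $(-\lambda, \lambda)$ of $0$ and the mgf exists.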

 

Theorem 4 (mgf generates moments)

If $X$ has mgf $M_X(t)$, then 

\[E X^{n} = M_{X}^{(n)}(0),\]

where we define 

\[M_{X}^{(n)}(0) = \frac{d^n}{d t^{n}}M_X(t) |_{t=0}.\]

That is, the nth moment is equal to the nth derivative of $M_X(t)$ evaluated at $t=0$. 
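
Continuing the exponential example, with $M_X(t) = \lambda/(\lambda - t)$ we get

\[M_{X}^{\prime}(t) = \frac{\lambda}{(\lambda-t)^2}, \qquad M_{X}^{\prime\prime}(t) = \frac{2\lambda}{(\lambda-t)^3},\]

so $EX = M_{X}^{\prime}(0) = 1/\lambda$ and $EX^2 = M_{X}^{\prime\prime}(0) = 2/\lambda^2$, agreeing with the direct computation $EX^n = n!/\lambda^n$.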

 

Remark 5 If the mgf exists, it characterizes an infinite set of moments. However, an infinite set of moments does not by itself uniquely determine a distribution function. If we impose some condition on the random variable, say that it has bounded support, then it is true that the infinite set of moments uniquely determines the distribution function.
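
The standard counterexample (cf. the lognormal example in Casella and Berger) takes the lognormal density $f_1(x) = \frac{1}{\sqrt{2\pi}\, x} e^{-(\log x)^2/2}$, $x > 0$, and the perturbed density $f_2(x) = f_1(x)\left[1 + \sin(2\pi \log x)\right]$; the two densities are different, yet both have the same moments $EX^r = e^{r^2/2}$ for every integer $r = 0, 1, 2, \ldots$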

Remark 6 Existence of all moments is not equivalent to existence of the moment generating function. However, if the mgf exists in a neighborhood of $0$, then the distribution is uniquely determined. An analogue is the distinction between a function that is analytic in a neighborhood of a point and one that merely has derivatives of all orders there.
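
The lognormal distribution above also illustrates this: all of its moments $EX^r = e^{r^2/2}$ are finite, yet $E e^{tX} = \infty$ for every $t > 0$, so its mgf does not exist in any neighborhood of $0$.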

 

Theorem 7 

Let $F_X$ and $F_Y$ be two cdfs all of whose moments exist.

a. If $X$ and $Y$ have bounded support, then $F_X(u)=F_Y(u)$ for all $u$ if and only if $E X^{r} = E Y^{r}$ for all integers $r = 0, 1, 2, \ldots$.

b. If the moment generating functions exist and $M_X(t) = M_Y(t)$ for all $t$ in some neighborhood of $0$, then $F_X(u) = F_Y(u)$ for all u. 

 

Theorem 8 

Suppose $\{X_i\}$, $i=1,2,3,\ldots$, is a sequence of random variables, each with mgf $M_{X_i}(t)$.

Furthermore, suppose that

\[\lim_{i\to \infty}M_{X_{i}}(t) = M_{X}(t), \]

for all $t$ in a neighborhood of 0, and $M_X(t)$  is an mgf.

Then there is a unique cdf $F_X$ whose moments are determined by $M_X(t)$ and, for all $x$ where $F_X(x)$ is continuous, we have

\[\lim_{i\to \infty}F_{X_{i}}(x) = F_{X}(x).\]

That is, convergence, for $|t|<h$, of mgfs to an mgf implies convergence of cdfs. 
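
A classical illustration: let $X_n \sim \mathrm{Binomial}(n, \lambda/n)$. Then

\[M_{X_n}(t) = \left(1 - \frac{\lambda}{n} + \frac{\lambda}{n} e^t\right)^n = \left(1 + \frac{\lambda(e^t - 1)}{n}\right)^n \longrightarrow e^{\lambda(e^t-1)} \quad \text{as } n \to \infty,\]

which is the mgf of the Poisson($\lambda$) distribution, so the binomial cdfs converge to the Poisson cdf (the Poisson approximation to the binomial).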

 

Original link: http://www.cnblogs.com/zhenan2014/p/4103704.html
