PROBABILITY III


MOMENTS


April 18, 2026

1 RAW MOMENTS

If X is a random variable with pmf \(p(x)\) or pdf \(f(x)\), then the r-th order raw moment is given by:

\begin {eqnarray*} \mu '_r & = & E \left (X^r \right ) \\ & = & \left \{ \begin {matrix} \sum _{x} x^r p(x) & ; \textit {if X is discrete}\\ \int _{-\infty }^{+\infty } x^r f(x) dx & ; \textit {if X is continuous} \end {matrix} \right . \end {eqnarray*}

provided \(E|X^r| < \infty \). \(\newline \)
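As a quick check of the definition, consider \(X \sim \text {Bernoulli}(p)\), with \(p(1) = p\) and \(p(0) = 1-p\). Then every raw moment equals \(p\):

\begin {eqnarray*} \mu '_r = E(X^r) = 0^r (1-p) + 1^r p = p \qquad \text {for all } r \geq 1 \end {eqnarray*}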

2 CENTRAL MOMENTS

If X is a random variable with pmf \(p(x)\) or pdf \(f(x)\), then the r-th order central moment (about \(\mu = E(X)\)) is given by:

\begin {eqnarray*} \mu _r & = & E \left ((X - \mu )^r \right ) \\ & = & \left \{ \begin {matrix} \sum _{x} (x-\mu )^r p(x) & ; \textit {if X is discrete}\\ \int _{-\infty }^{+\infty } (x-\mu )^r f(x) dx & ; \textit {if X is continuous} \end {matrix} \right . \end {eqnarray*}

provided \(E|(X - \mu )^r| < \infty \). \(\newline \)
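For example, if \(X \sim \text {Bernoulli}(p)\), then \(\mu = E(X) = p\), so the second central moment (the variance) is

\begin {eqnarray*} \mu _2 = (0-p)^2 (1-p) + (1-p)^2 p = p(1-p)\left ( p + 1 - p \right ) = p(1-p) \end {eqnarray*}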

3 MOMENTS ABOUT AN ARBITRARY POINT A

If X is a random variable with pmf \(p(x)\) or pdf \(f(x)\), then the r-th order moment about A is given by:

\begin {eqnarray*} \mu '_r(A) & = & E \left ((X - A)^r \right ) \\ & = & \left \{ \begin {matrix} \sum _{x} (x-A)^r p(x) & ; \textit {if X is discrete}\\ \int _{-\infty }^{+\infty } (x-A)^r f(x) dx & ; \textit {if X is continuous} \end {matrix} \right . \end {eqnarray*}

provided \(E|(X - A)^r| < \infty \). \(\newline \)
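Writing \(\mu '_r(A)\) for the r-th moment about A, the two earlier definitions are recovered as special cases:

\begin {eqnarray*} \mu '_r(0) = E(X^r) = \mu '_r, \qquad \mu '_r(\mu ) = E \left ((X - \mu )^r \right ) = \mu _r \end {eqnarray*}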

4 SOME THEOREMS

  1. If \(\mu '_r\) exists for a r.v. X then \(\mu '_s\) also exists for \(0<s<r\).
    Proof: We know that if \(|X| \geq 1\), then

    \begin {eqnarray*} |X|^s \leq |X|^r \end {eqnarray*}

    Again, if \(|X| < 1\), then

    \begin {eqnarray*} |X|^s < 1 \end {eqnarray*}

    Combining the two cases, \(|X|^s \leq 1 + |X|^r\) for every value of X, so \(E|X|^s \leq 1 + E|X|^r < \infty \).
    Thus, if \(\mu '_r\) exists for a r.v. X then \(\mu '_s\) also exists for \(0<s<r\).

  2. If \(\mu _r\) exists for a r.v. X then \(\mu _s\) also exists if \(0<s<r\).
    Proof: Same as above, applying the argument to \(X - \mu \) in place of X.
  3. If \(\mu '_r(A)\) exists for a r.v. X then \(\mu _r\) also exists.
    Proof: Write \((X - \mu )^r = ((X - A) - (\mu - A))^r\) and expand binomially; each term involves a moment about A of order at most r, which exists by the first theorem above.
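The converse of the first theorem is false: lower-order moments may exist while higher-order ones do not. For instance, take the pmf \(p(x) = c/x^4\) for \(x = 1, 2, 3, \dots \), where \(c\) is the normalising constant. Then

\begin {eqnarray*} \mu '_2 = c \sum _{x=1}^{\infty } \frac {1}{x^2} < \infty , \qquad \text {but} \qquad \mu '_3 = c \sum _{x=1}^{\infty } \frac {1}{x} = \infty \end {eqnarray*}

so \(\mu '_2\) (and hence, by the theorem, \(\mu '_1\)) exists while \(\mu '_3\) does not.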

5 Relation between Central Moments and Raw Moments

We know that,

\begin {eqnarray*} \mu _r &=& E(X - \mu )^r \\ &=& E \left ( \sum _{i = 0}^{r} (-1)^{r-i} \left ( \begin {matrix} r \\ i \end {matrix} \right ) X^i \mu ^{r - i} \right ) \\ & = & \sum _{i = 0}^{r} (-1)^{r-i} \left ( \begin {matrix} r \\ i \end {matrix} \right ) \mu '_i \mu ^{r - i} \end {eqnarray*}

Conversely,

\begin {eqnarray*} \mu '_r &=& E(X^r) \\ &=& E((X - \mu ) + \mu )^r \\ &=& E \left ( \sum _{i = 0}^{r} \left ( \begin {matrix} r \\ i \end {matrix} \right ) (X - \mu )^i \mu ^{r - i} \right ) \\ & = & \sum _{i = 0}^{r} \left ( \begin {matrix} r \\ i \end {matrix} \right ) \mu _i \mu ^{r - i} \end {eqnarray*}
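Putting \(r = 2\) and \(r = 3\) in the first relation (and using \(\mu '_0 = 1\), \(\mu '_1 = \mu \)) gives the familiar special cases

\begin {eqnarray*} \mu _2 &=& \mu '_2 - \mu '^2_1 \\ \mu _3 &=& \mu '_3 - 3 \mu '_2 \mu '_1 + 2 \mu '^3_1 \end {eqnarray*}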

6 SOME INEQUALITIES

  1. If X,Y are two random variables defined on the same \((S,\mathcal {A},P)\) and they have finite variances then,

    \begin {eqnarray*} E(X^2).E(Y^2) \geq \left ( E(X.Y) \right )^2 \end {eqnarray*}

    Proof: We know that \(|XY| \leq (X^2 + Y^2)/2\), and the right-hand side has finite expectation since the variances exist. Thus \(E|XY| < \infty \), so \(E(XY)\) exists.
    For any real number t, define

    \begin {eqnarray*} h(t) = E(tX - Y)^2 \end {eqnarray*}

    Then clearly \(h(t)\) is real and \(\geq 0\).
    Now consider the equation, 

    \begin {eqnarray*} h(t) = t^2E(X^2) -2tE(XY) +E(Y^2) =0 \end {eqnarray*}

    Assuming \(E(X^2) > 0\), the roots of \(h(t) = 0\) are

    \begin {eqnarray*} t = \frac { 2E(XY) \pm \sqrt {4(E(XY))^2 - 4E(X^2).E(Y^2) }}{2E(X^2)} \end {eqnarray*}

    Since a quadratic that is nonnegative for all real t can have no two distinct real roots, its discriminant must satisfy \(4(E(XY))^2 - 4E(X^2).E(Y^2) \leq 0 \). Thus, \((E(XY))^2 \leq E(X^2).E(Y^2)\). (If \(E(X^2) = 0\), then \(X = 0\) almost surely and the inequality holds trivially.)

  2. If both \(\mu '_{2r}\) and \(\mu '_{2s}\) exist then, applying the previous inequality to \(X^r\) and \(X^s\) (and to their central analogues),

    \begin {eqnarray*} \mu '_{2r}. \mu '_{2s} \geq \left ( \mu '_{r+s} \right )^2 \\ \mu _{2r}. \mu _{2s} \geq \left ( \mu _{r+s} \right )^2 \\ \beta = \frac {\mu _4}{\mu _2^2} \geq 1, \text { provided $\mu _2$ >0} \end {eqnarray*}

    The last line is the case \(r = 2, s = 0\) of the second inequality, since \(\mu _0 = 1\).
  3. For any random variable X,

    \begin {eqnarray*} \left | \begin {matrix} \mu _{2a} & \mu _{a+b} & \mu _{a+c} \\ \mu _{a+b} & \mu _{2b} & \mu _{b+c} \\ \mu _{a+c} & \mu _{b+c} & \mu _{2c} \end {matrix} \right | \geq 0 \end {eqnarray*}
  4. For any random variable X (the case \(a = 0, b = 1, c = 2\) of the above, with \(\mu _0 = 1\)),

    \begin {eqnarray*} \left | \begin {matrix} \mu _{0} & \mu _{1} & \mu _{2} \\ \mu _{1} & \mu _{2} & \mu _{3} \\ \mu _{2} & \mu _{3} & \mu _{4} \end {matrix} \right | \geq 0 \end {eqnarray*}
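Applying the first inequality to the centred variables \(X - E(X)\) and \(Y - E(Y)\) shows that the correlation coefficient is always bounded:

\begin {eqnarray*} \left ( \text {Cov}(X,Y) \right )^2 \leq \text {Var}(X).\text {Var}(Y), \qquad \text {i.e.} \quad -1 \leq \rho \leq 1 \end {eqnarray*}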

7 FACTORIAL MOMENTS

If X is a discrete random variable with pmf \(Pr[X = x] = p(x); x = 1,2,3,\dots \), then the r-th factorial moment of the random variable X is given by:

\begin {eqnarray*} \mu _{(r)} &=& E\left ( X(X-1)\dots (X-r+1) \right ) \\ &=& \sum _{x = r}^{\infty } x(x-1)\dots (x-r+1) p(x) \end {eqnarray*}

It is to be noted that, \begin {align*} \mu _{(1)} &= \mu '_1 \\ \mu _{(2)} &= \mu '_2 - \mu '_1 \\ \mu _{(3)} &= \mu '_3 - 3\mu '_2 + 2 \mu '_1\\ \mu _{(4)} &= \mu '_4 - 6\mu '_3 + 11\mu '_2 - 6\mu '_1 \end {align*}

and \begin {align*} \mu '_1 &= \mu _{(1)} \\ \mu '_2 &= \mu _{(2)} + \mu _{(1)}\\ \mu '_3 &= \mu _{(3)} + 3\mu _{(2)} + \mu _{(1)}\\ \mu '_4 &= \mu _{(4)} + 6\mu _{(3)} + 7 \mu _{(2)} + \mu _{(1)} \end {align*}
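For example, if \(X \sim \text {Poisson}(\lambda )\) (extending the definition to the support \(x = 0, 1, 2, \dots \)), the factorial moments take a particularly simple form:

\begin {eqnarray*} \mu _{(r)} = \sum _{x = r}^{\infty } \frac {x!}{(x-r)!} \cdot \frac {e^{-\lambda } \lambda ^x}{x!} = \lambda ^r \sum _{x = r}^{\infty } \frac {e^{-\lambda } \lambda ^{x-r}}{(x-r)!} = \lambda ^r \end {eqnarray*}

The conversion formulas above then give \(\mu '_1 = \lambda \) and \(\mu '_2 = \lambda ^2 + \lambda \), so that \(\mu _2 = \mu '_2 - \mu '^2_1 = \lambda \).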