Page 168 - DMTH404_STATISTICS

Note that the above integral is finite for t ∈ [−h, h] for any 0 < h < ∞, so that X possesses a moment generating function.


    M_X(t) = λ / (λ − t),   for t < λ

                                    12.2 Deriving Moments with the mgf

The moment generating function takes its name from the fact that it can be used to derive the moments of X, as stated in the following proposition.

Proposition: If a random variable X possesses a moment generating function M_X(t), then, for any n ∈ ℕ, the n-th moment of X (denote it by μ_X(n)) exists and is finite.
Furthermore:

    μ_X(n) = E[X^n] = d^n M_X(t) / dt^n |_{t=0}

where d^n M_X(t) / dt^n |_{t=0} denotes the n-th derivative of M_X(t) with respect to t, evaluated at the point t = 0.
Proving the above proposition is quite complicated, because a lot of analytical details must be taken care of (see e.g. Pfeiffer, P.E. (1978), Concepts of Probability Theory, Courier Dover Publications). The intuition, however, is straightforward: since the expected value is a linear operator and differentiation is a linear operation, under appropriate conditions one can differentiate through the expected value, as follows:
    d^n M_X(t) / dt^n = d^n E[exp(tX)] / dt^n = E[ d^n exp(tX) / dt^n ] = E[X^n exp(tX)]
                                    which, evaluated at the point t = 0, yields
    d^n M_X(t) / dt^n |_{t=0} = E[X^n exp(0·X)] = E[X^n] = μ_X(n)
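The proposition can be checked numerically for a simple distribution. The following sketch (not part of the original text; the fair six-sided die is chosen here purely for illustration) approximates the first two derivatives of the mgf at t = 0 by central differences and compares them with the moments E[X] and E[X^2] computed directly.

```python
import math

# For a fair six-sided die, the mgf is the finite sum
# M(t) = (1/6) * sum_{k=1}^{6} exp(t*k).
def mgf(t):
    return sum(math.exp(t * k) for k in range(1, 7)) / 6.0

h = 1e-4
# First derivative at 0  ~ E[X]   (central difference, O(h^2) error)
d1 = (mgf(h) - mgf(-h)) / (2 * h)
# Second derivative at 0 ~ E[X^2] (second-order central difference)
d2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2

moment1 = sum(k for k in range(1, 7)) / 6.0     # E[X]   = 3.5
moment2 = sum(k**2 for k in range(1, 7)) / 6.0  # E[X^2] = 91/6

print(d1, moment1)  # both close to 3.5
print(d2, moment2)  # both close to 15.1667
```

The agreement (up to the discretization error of the finite differences) illustrates that differentiating the mgf at zero recovers the raw moments.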
       Example: Continuing the example above, the moment generating function of an exponential random variable with parameter λ is

    M_X(t) = λ / (λ − t)
The expected value of X can be computed by taking the first derivative of the moment generating function,

    dM_X(t)/dt = λ / (λ − t)^2

and evaluating it at t = 0, which gives E[X] = λ/λ² = 1/λ.
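The exponential example can be worked through in code. In this sketch (the rate lam = 2.0 is an arbitrary choice for illustration, not from the text), the closed-form derivatives of M_X(t) = λ/(λ − t) are evaluated at t = 0 to recover E[X] = 1/λ and E[X^2] = 2/λ².

```python
lam = 2.0  # assumed rate parameter, chosen only for illustration

def mgf(t):
    return lam / (lam - t)           # M_X(t), defined for t < lam

def mgf_d1(t):
    return lam / (lam - t) ** 2      # first derivative dM_X/dt

def mgf_d2(t):
    return 2 * lam / (lam - t) ** 3  # second derivative d^2 M_X/dt^2

mean = mgf_d1(0.0)        # E[X]   = 1/lam = 0.5
second = mgf_d2(0.0)      # E[X^2] = 2/lam^2 = 0.5
var = second - mean ** 2  # Var(X) = 1/lam^2 = 0.25

print(mean, second, var)  # 0.5 0.5 0.25
```

Differentiating once more and subtracting the squared mean gives the familiar variance 1/λ² of the exponential distribution.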



            160                              LOVELY PROFESSIONAL UNIVERSITY