                                % x is the data vector and n its length (defined
                                % earlier in this example).
                                mu = mean(x);
                                % Find the numerator and denominator for gamma_1.
                                num = (1/n)*sum((x-mu).^3);
                                den = (1/n)*sum((x-mu).^2);
                                gam1 = num/den^(3/2);
                             This results in a coefficient of skewness of gam1 = -0.0542, which is not
                             too far from zero. Now we find the kurtosis using the following MATLAB
                             commands:
                                % Find the kurtosis.
                                num = (1/n)*sum((x-mu).^4);
                                den = (1/n)*sum((x-mu).^2);
                                gam2 = num/den^2;
                             This gives a kurtosis of gam2 = 1.8766, which is not close to 3, as expected.


                              We note that these statistics might not be the best to use in terms of bias (see
                             Section 3.4). However, they will prove to be useful as examples in Chapters 6
                             and 7, where we look at bootstrap methods for estimating the bias in a statis-
                             tic. The MATLAB Statistics Toolbox function called skewness returns the
                             coefficient of skewness for a random sample. The function kurtosis calcu-
                             lates the sample coefficient of kurtosis (not the coefficient of excess kurtosis).
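
                             As a quick check, these toolbox functions can be compared with the quanti-
                             ties computed by hand above. The following sketch assumes the data vector x
                             from this example is still in the workspace; by default both functions use the
                             biased (1/n) moment estimators, so the results should agree with gam1 and
                             gam2:

                                % Compare with the Statistics Toolbox functions.
                                sk = skewness(x);   % coefficient of skewness
                                kt = kurtosis(x);   % coefficient of kurtosis (not excess)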




                             Covariance
                             In the definitions given below (Equations 3.9 and 3.10), we assume that all
                             expectations exist. The covariance of two random variables X and Y, with
                             joint probability density function f(x,y), is defined as

                                 \mathrm{Cov}(X,Y) = \sigma_{X,Y} = E[(X - \mu_X)(Y - \mu_Y)] .        (3.9)
                             The correlation coefficient of X and Y is given by

                                 \mathrm{Corr}(X,Y) = \rho_{X,Y} = \frac{\mathrm{Cov}(X,Y)}{\sigma_X \sigma_Y}
                                                    = \frac{\sigma_{X,Y}}{\sigma_X \sigma_Y} ,          (3.10)

                             where σ_X > 0 and σ_Y > 0.
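
                             To illustrate these definitions, sample versions of the covariance and the cor-
                             relation coefficient can be obtained in MATLAB as in the following sketch.
                             The vectors x and y here are hypothetical data generated only for the exam-
                             ple, and note that cov and corrcoef normalize by n - 1 rather than n:

                                % Generate two hypothetical correlated samples.
                                n = 100;
                                x = randn(n,1);
                                y = 2*x + randn(n,1);
                                % The (1,2) element of the sample covariance matrix
                                % estimates Cov(X,Y).
                                c = cov(x,y);
                                covxy = c(1,2);
                                % The (1,2) element of the sample correlation matrix
                                % estimates Corr(X,Y).
                                r = corrcoef(x,y);
                                corrxy = r(1,2);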
                              The correlation is a measure of the linear relationship between two random
                             variables. If the joint distribution of two variables has a correlation coeffi-
                             cient, then -1 ≤ ρ_{X,Y} ≤ 1. When ρ_{X,Y} = 1, then X and Y are perfectly posi-
                             tively correlated. This means that the possible values for X and Y lie on a line
                             with positive slope. On the other hand, when ρ_{X,Y} = -1, then the situation
                             is the opposite: X and Y are perfectly negatively correlated. If X and Y are


