F_X(x_1; t_1) is known as the first-order distribution of X(t). Similarly, given t_1 and t_2, X(t_1) = X_1 and X(t_2) = X_2 represent two r.v.'s. Their joint distribution is known as the second-order distribution of X(t) and is given by

    F_X(x_1, x_2; t_1, t_2) = P\{X(t_1) \le x_1, X(t_2) \le x_2\}                    (5.2)
In general, we define the nth-order distribution of X(t) by

    F_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = P\{X(t_1) \le x_1, \ldots, X(t_n) \le x_n\}        (5.3)
If X(t) is a discrete-time process, then X(t) is specified by a collection of pmf's:

    p_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = P\{X(t_1) = x_1, \ldots, X(t_n) = x_n\}            (5.4)
If X(t) is a continuous-time process, then X(t) is specified by a collection of pdf's:

    f_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = \frac{\partial^n F_X(x_1, \ldots, x_n; t_1, \ldots, t_n)}{\partial x_1 \cdots \partial x_n}        (5.5)
The complete characterization of X(t) requires knowledge of all the distributions as n → ∞. Fortunately, often much less is sufficient.
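As a concrete numerical illustration (not from the original text), a first-order distribution can be estimated by ensemble averaging. The sketch below assumes a hypothetical random-phase sinusoid X(t) = cos(t + Θ) with Θ uniform on [0, 2π); the process, sample size, and the chosen t_1 and x_1 are illustrative assumptions.

```python
import numpy as np

# Sketch: estimate the first-order distribution F_X(x1; t1) = P{X(t1) <= x1}
# for the assumed random-phase sinusoid X(t) = cos(t + Theta), Theta ~ U(0, 2*pi),
# by averaging an indicator over many independent sample paths.
rng = np.random.default_rng(0)

n_paths = 100_000          # number of sample paths in the ensemble (assumed)
t1, x1 = 0.7, 0.5          # fixed observation time and threshold (assumed)

theta = rng.uniform(0.0, 2.0 * np.pi, size=n_paths)   # one Theta per sample path
x_at_t1 = np.cos(t1 + theta)                           # X(t1) across the ensemble

F_hat = np.mean(x_at_t1 <= x1)                         # empirical P{X(t1) <= x1}
exact = 1.0 - np.arccos(x1) / np.pi                    # known arcsine-law cdf for this process
print(f"estimated F_X({x1}; {t1}) = {F_hat:.4f}, exact = {exact:.4f}")
```

For this particular process the first-order distribution happens not to depend on t_1, but in general it does.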

B. Mean, Correlation, and Covariance Functions:
As in the case of r.v.'s, random processes are often described by using statistical averages.
The mean of X(t) is defined by

    \mu_X(t) = E[X(t)]                                                               (5.6)
where X(t) is treated as a random variable for a fixed value of t. In general, μ_X(t) is a function of time, and it is often called the ensemble average of X(t). A measure of dependence among the r.v.'s of X(t) is provided by its autocorrelation function, defined by

    R_X(t, s) = E[X(t)X(s)]                                                          (5.7)
Note that

    R_X(t, s) = R_X(s, t)                                                            (5.8)

and

    R_X(t, t) = E[X^2(t)]                                                            (5.9)
The autocovariance function of X(t) is defined by

    K_X(t, s) = \mathrm{Cov}[X(t), X(s)] = E\{[X(t) - \mu_X(t)][X(s) - \mu_X(s)]\}
              = R_X(t, s) - \mu_X(t)\mu_X(s)                                         (5.10)
It is clear that if the mean of X(t) is zero, then K_X(t, s) = R_X(t, s). Note that the variance of X(t) is given by

    \sigma_X^2(t) = \mathrm{Var}[X(t)] = E\{[X(t) - \mu_X(t)]^2\} = K_X(t, t)        (5.11)
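These ensemble averages can also be approximated numerically. The sketch below (not from the original text) assumes a hypothetical process X(t) = A cos(t) with random amplitude A ~ N(2, 1) and estimates μ_X(t), R_X(t, s), and K_X(t, s) by averaging over many sample paths; the process and all parameters are illustrative assumptions.

```python
import numpy as np

# Sketch: estimate the mean (5.6), autocorrelation (5.7), and autocovariance (5.10)
# of the assumed process X(t) = A*cos(t), A ~ N(2, 1), by ensemble averaging.
rng = np.random.default_rng(1)

n_paths = 200_000                  # number of sample paths (assumed)
t, s = 0.3, 1.1                    # two fixed time instants (assumed)

A = rng.normal(loc=2.0, scale=1.0, size=n_paths)   # one random amplitude per path
x_t = A * np.cos(t)                                # X(t) across the ensemble
x_s = A * np.cos(s)                                # X(s) across the ensemble

mu_t = x_t.mean()                                  # estimate of mu_X(t)
mu_s = x_s.mean()                                  # estimate of mu_X(s)
R_ts = np.mean(x_t * x_s)                          # estimate of R_X(t, s)
K_ts = np.mean((x_t - mu_t) * (x_s - mu_s))        # estimate of K_X(t, s)

# For this process: mu_X(t) = 2*cos(t), R_X(t,s) = 5*cos(t)*cos(s), K_X(t,s) = cos(t)*cos(s).
print(f"mu_X(t)  = {mu_t:.4f}  (exact {2.0 * np.cos(t):.4f})")
print(f"R_X(t,s) = {R_ts:.4f}  (exact {5.0 * np.cos(t) * np.cos(s):.4f})")
print(f"K_X(t,s) = {K_ts:.4f}  (check R - mu*mu = {R_ts - mu_t * mu_s:.4f})")
```

The last printed line illustrates the identity K_X(t, s) = R_X(t, s) - μ_X(t)μ_X(s) of Eq. (5.10).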
If X(t) is a complex random process, then its autocorrelation function R_X(t, s) and autocovariance function K_X(t, s) are defined, respectively, by

    R_X(t, s) = E[X(t)X^*(s)]                                                        (5.12)

    K_X(t, s) = E\{[X(t) - \mu_X(t)][X(s) - \mu_X(s)]^*\} = R_X(t, s) - \mu_X(t)\mu_X^*(s)        (5.13)

where the asterisk denotes the complex conjugate.
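The only change from the real case is the conjugate on the second factor. As a numerical check (not from the original text), the sketch below assumes a hypothetical complex process X(t) = A e^{j w0 t} with a real standard-normal amplitude A, for which R_X(t, s) = E[A^2] e^{j w0 (t - s)}; the process and parameters are illustrative assumptions.

```python
import numpy as np

# Sketch: estimate the complex autocorrelation R_X(t, s) = E[X(t) X*(s)] of (5.12)
# for the assumed process X(t) = A * exp(1j*w0*t), with A ~ N(0, 1) real.
rng = np.random.default_rng(2)

n_paths = 200_000                  # number of sample paths (assumed)
w0 = 2.0                           # angular frequency (assumed)
t, s = 0.4, 1.3                    # two fixed time instants (assumed)

A = rng.normal(size=n_paths)                       # one real amplitude per path
x_t = A * np.exp(1j * w0 * t)                      # X(t) across the ensemble
x_s = A * np.exp(1j * w0 * s)                      # X(s) across the ensemble

R_ts = np.mean(x_t * np.conj(x_s))                 # E[X(t) X*(s)], note the conjugate
exact = np.exp(1j * w0 * (t - s))                  # E[A^2] = 1 for this choice of A
print("estimated R_X(t,s):", np.round(R_ts, 4))
print("exact     R_X(t,s):", np.round(exact, 4))
```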
5.4 CLASSIFICATION OF RANDOM PROCESSES
If a random process X(t) possesses some special probabilistic structure, we can specify less to characterize X(t) completely. Some simple random processes are characterized completely by only the first- and second-order distributions.