are independent. If {X(t), t ≥ 0} has independent increments and X(t) - X(s) has the same distribution
as X(t + h) - X(s + h) for all s, t, h ≥ 0, s < t, then the process X(t) is said to have stationary
independent increments.
    Let {X(t), t ≥ 0} be a random process with stationary independent increments and assume that
X(0) = 0. Then (Probs. 5.21 and 5.22)
\[
E[X(t)] = \mu_1 t \qquad (5.23)
\]
where μ₁ = E[X(1)], and
\[
\operatorname{Var}[X(t)] = \sigma_1^2 t \qquad (5.24)
\]
where σ₁² = Var[X(1)].
    From Eq. (5.24), we see that processes with stationary independent increments are nonstationary.
Examples of processes with stationary independent increments are Poisson processes and Wiener
processes, which are discussed in later sections.
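
    As an illustrative aside (added here, not part of the original text), the short Python sketch below
builds a discrete-time process with stationary independent increments by accumulating i.i.d. steps and
checks numerically that the mean and variance grow linearly in t, as Eqs. (5.23) and (5.24) predict.
The step distribution, sample sizes, and variable names are arbitrary choices made for this example.

import numpy as np

# Sketch only: X(n) = sum of n i.i.d. increments, with X(0) = 0, so the
# process has stationary independent increments.  With exponential(1)
# increments, mu1 = E[X(1)] = 1 and sigma1^2 = Var[X(1)] = 1, so
# Eqs. (5.23)-(5.24) predict E[X(t)] = t and Var[X(t)] = t.
rng = np.random.default_rng(0)

num_paths = 50_000                 # number of independent sample paths (arbitrary)
num_steps = 100                    # time horizon (arbitrary)

steps = rng.exponential(scale=1.0, size=(num_paths, num_steps))
paths = np.cumsum(steps, axis=1)   # X(1), X(2), ..., X(num_steps) for each path

t = np.arange(1, num_steps + 1)
mean_err = np.abs(paths.mean(axis=0) - t) / t   # relative error vs. mu1 * t
var_err = np.abs(paths.var(axis=0) - t) / t     # relative error vs. sigma1^2 * t
print("worst relative mean error:", mean_err.max())
print("worst relative variance error:", var_err.max())

Both errors shrink as num_paths grows, and the same linear growth appears for any i.i.d. step
distribution with finite variance.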


E. Markov Processes:

    A random process {X(t), t ∈ T} is said to be a Markov process if
\[
P[X(t_{n+1}) \le x_{n+1} \mid X(t_1) = x_1,\; X(t_2) = x_2,\; \ldots,\; X(t_n) = x_n]
  = P[X(t_{n+1}) \le x_{n+1} \mid X(t_n) = x_n] \qquad (5.26)
\]
whenever t₁ < t₂ < ⋯ < tₙ < tₙ₊₁.
    A discrete-state Markov process is called a Markov chain. For a discrete-parameter Markov
chain {Xₙ, n ≥ 0} (see Sec. 5.5), we have for every n
\[
P(X_{n+1} = j \mid X_0 = i_0,\; X_1 = i_1,\; \ldots,\; X_n = i) = P(X_{n+1} = j \mid X_n = i) \qquad (5.27)
\]
Equation (5.26) or Eq. (5.27) is referred to as the Markov property (which is also known as the
memoryless property). This property of a Markov process states that the future state of the process
depends only on the present state and not on the past history. Clearly, any process with independent
increments is a Markov process.
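
    To make the memoryless property concrete, the following Python sketch (an added illustration, not
taken from the text) simulates a two-state Markov chain with an assumed transition matrix P and checks
empirically that the next-state frequencies given the pair (previous state, current state) agree with
those given the current state alone, as Eq. (5.27) asserts.

import numpy as np

# Sketch only: two-state Markov chain with an assumed transition matrix P.
rng = np.random.default_rng(1)

P = np.array([[0.9, 0.1],          # P[i, j] = P(X_{n+1} = j | X_n = i), assumed values
              [0.4, 0.6]])
num_steps = 200_000

states = np.empty(num_steps, dtype=int)
states[0] = 0
for n in range(1, num_steps):
    states[n] = rng.choice(2, p=P[states[n - 1]])

# Estimate P(X_{n+2} = 1 | X_n = a, X_{n+1} = b) for every pair (a, b); by the
# Markov property it should depend on b only, matching P[b, 1].
for a in (0, 1):
    for b in (0, 1):
        mask = (states[:-2] == a) & (states[1:-1] == b)
        est = states[2:][mask].mean()
        print(f"P(next = 1 | prev = {a}, cur = {b}) ~ {est:.3f}   (P[{b}, 1] = {P[b, 1]})")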
    Using the Markov property, the nth-order distribution of a Markov process X(t) can be
expressed as (Prob. 5.25)
\[
F_X(x_1, \ldots, x_n;\; t_1, \ldots, t_n)
  = F_X(x_1;\; t_1) \prod_{k=2}^{n} P\{X(t_k) \le x_k \mid X(t_{k-1}) = x_{k-1}\} \qquad (5.28)
\]
Thus, all finite-order distributions of a Markov process can be expressed in terms of the second-order
distributions.
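
    As a quick check of that last statement, writing Eq. (5.28) out for n = 3 (an expansion implied by
the formula rather than quoted from the text) gives
\[
F_X(x_1, x_2, x_3;\; t_1, t_2, t_3)
  = F_X(x_1;\; t_1)\,
    P\{X(t_2) \le x_2 \mid X(t_1) = x_1\}\,
    P\{X(t_3) \le x_3 \mid X(t_2) = x_2\}, \qquad t_1 < t_2 < t_3,
\]
so the third-order distribution is assembled from one first-order distribution and two conditional
(second-order) quantities; the same pattern holds for every n.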



F. Normal Processes:

    A random process {X(t), t ∈ T} is said to be a normal (or gaussian) process if for any integer n
and any subset {t₁, ..., tₙ} of T, the n r.v.'s X(t₁), ..., X(tₙ) are jointly normally distributed in the
sense that their joint characteristic function is given by
\[
\Psi_{X(t_1) \cdots X(t_n)}(\omega_1, \ldots, \omega_n)
  = \exp\!\left[\, j \sum_{i=1}^{n} \omega_i \mu(t_i)
    - \frac{1}{2} \sum_{i=1}^{n} \sum_{k=1}^{n} \omega_i \omega_k C_X(t_i, t_k) \right] \qquad (5.29)
\]
where μ(tᵢ) = E[X(tᵢ)], C_X(tᵢ, tₖ) = Cov[X(tᵢ), X(tₖ)], and ω₁, ..., ωₙ are any real numbers (see
Probs. 5.59 and 5.60). Equation (5.29) shows that a normal process is completely characterized by
the second-order distributions. Thus, if a normal process is wide-sense stationary, then it is also
strictly stationary.
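
    As a closing illustration (added here, not part of the text), the Python sketch below treats a
wide-sense stationary gaussian process specified only by a constant mean and an assumed autocovariance
C_X(τ) = e^{-|τ|}. Because the characteristic function (5.29) depends only on these two quantities, the
finite-dimensional distributions are unchanged by a time shift, which is the strict stationarity claimed
above. The autocovariance, time points, and shift are arbitrary choices made for this example.

import numpy as np

# Sketch only: a WSS gaussian process with mean m and autocovariance
# C_X(tau) = exp(-|tau|) (assumed).  Its joint distribution at any set of
# times is the multivariate normal built from these two quantities alone.
rng = np.random.default_rng(2)

def cov_matrix(times, acov=lambda tau: np.exp(-np.abs(tau))):
    """Covariance matrix C_X(t_i, t_k) = acov(t_i - t_k) for a WSS process."""
    t = np.asarray(times, dtype=float)
    return acov(t[:, None] - t[None, :])

m = 0.0                                # constant mean (assumed)
t = np.array([0.0, 0.5, 1.3, 2.0])     # sampling instants (arbitrary)
c = 10.0                               # arbitrary time shift

# The mean vector and covariance matrix, hence the joint normal distribution,
# are identical at times t and t + c: strict stationarity.
print("covariance unchanged by shift:", np.allclose(cov_matrix(t), cov_matrix(t + c)))

# Drawing joint samples of (X(t_1), ..., X(t_4)) needs nothing beyond
# (mean, covariance), illustrating "completely characterized by the
# second-order distributions".
samples = rng.multivariate_normal(mean=m + np.zeros_like(t), cov=cov_matrix(t), size=5)
print(samples.shape)                   # (5, 4)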