
Proof  For fixed $\Delta, x > 0$, let $U_{\Delta,x}$ be a Poisson distributed random variable with
\[
P\{U_{\Delta,x} = k\Delta\} = e^{-x/\Delta}\,\frac{(x/\Delta)^k}{k!}, \qquad k = 0, 1, \dots .
\]
It is immediately verified that $E(U_{\Delta,x}) = x$ and $\sigma^2(U_{\Delta,x}) = x\Delta$.
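For completeness, this follows from the standard Poisson moments: writing $U_{\Delta,x} = \Delta N$ with $N$ Poisson distributed with mean $x/\Delta$,
\[
E(U_{\Delta,x}) = \Delta\, E(N) = \Delta\cdot\frac{x}{\Delta} = x,
\qquad
\sigma^2(U_{\Delta,x}) = \Delta^2\, \sigma^2(N) = \Delta^2\cdot\frac{x}{\Delta} = x\Delta.
\]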
Let $g(t)$ be any bounded function. We now prove that
\[
\lim_{\Delta\to 0} E[g(U_{\Delta,x})] = g(x) \tag{5.5.2}
\]
for each continuity point $x$ of $g(t)$. To see this, fix $\varepsilon > 0$ and a continuity point $x$ of $g(t)$. Then there exists a number $\delta > 0$ such that $|g(t) - g(x)| \le \varepsilon/2$ for all $t$ with $|t - x| \le \delta$. Also, let $M > 0$ be such that $|g(t)| \le M/2$ for all $t$. Then

\[
\begin{aligned}
|E[g(U_{\Delta,x})] - g(x)| &\le \sum_{k=0}^{\infty} |g(k\Delta) - g(x)|\, P\{U_{\Delta,x} = k\Delta\} \\
&\le \frac{\varepsilon}{2} + M \sum_{k:\,|k\Delta - x| > \delta} P\{U_{\Delta,x} = k\Delta\} \\
&= \frac{\varepsilon}{2} + M\, P\{|U_{\Delta,x} - E(U_{\Delta,x})| > \delta\},
\end{aligned}
\]
where the second inequality bounds the terms with $|k\Delta - x| \le \delta$ by $\varepsilon/2$ and uses $|g(k\Delta) - g(x)| \le |g(k\Delta)| + |g(x)| \le M$ for the remaining terms.
By Chebyshev's inequality, $P\{|U_{\Delta,x} - E(U_{\Delta,x})| > \delta\} \le x\Delta/\delta^2$. For $\Delta$ small enough, we have $Mx\Delta/\delta^2 \le \tfrac{1}{2}\varepsilon$. This proves the relation (5.5.2). Next, we apply (5.5.2) with $g(t) = F(t)$. Hence, for any continuity point $x$ of $F(t)$,
\[
\begin{aligned}
F(x) = \lim_{\Delta\to 0} E[F(U_{\Delta,x})]
&= \lim_{\Delta\to 0} \sum_{k=0}^{\infty} F(k\Delta)\, e^{-x/\Delta}\,\frac{(x/\Delta)^k}{k!} \\
&= \lim_{\Delta\to 0} \sum_{k=0}^{\infty} e^{-x/\Delta}\,\frac{(x/\Delta)^k}{k!} \sum_{j=1}^{k} p_j(\Delta),
\end{aligned}
\]
                where the latter equality uses that F(0) = 0. Interchanging the order of summation,
                we next obtain
\[
F(x) = \lim_{\Delta\to 0} \sum_{j=1}^{\infty} p_j(\Delta) \sum_{k=j}^{\infty} e^{-x/\Delta}\,\frac{(x/\Delta)^k}{k!},
\]
                yielding the desired result.
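Note that, for each $j$,
\[
\sum_{k=j}^{\infty} e^{-x/\Delta}\,\frac{(x/\Delta)^k}{k!}
= 1 - \sum_{k=0}^{j-1} e^{-x/\Delta}\,\frac{(x/\Delta)^k}{k!}
\]
is precisely the probability that an Erlangian ($E_j$) distributed random variable with $j$ phases and scale parameter $\Delta$ takes a value of at most $x$, so the limit above is indeed a limit of mixtures of Erlangian distribution functions with the same scale parameter, in agreement with (5.5.1).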
  The proof of Theorem 5.5.1 shows that the result also holds when $F(t)$ has a positive mass at $t = 0$. We should then add the term $F(0)$ to the right-hand side of (5.5.1). Roughly stated, Theorem 5.5.1 tells us that the probability distribution of any positive random variable can be arbitrarily closely approximated by a mixture of Erlangian distributions with the same scale parameters (a small numerical sketch of this approximation is given below). The fact that the Erlangian distributions have identical scale parameters simplifies the construction
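The following is a minimal numerical sketch of this approximation, evaluating the truncated mixture $\sum_{j=1}^{J} p_j(\Delta)\,P\{E_{j,\Delta} \le x\}$ with $p_j(\Delta) = F(j\Delta) - F((j-1)\Delta)$ for decreasing $\Delta$; the lognormal target distribution, the truncation level and the helper function names are illustrative assumptions, not taken from the text.

```python
# A minimal numerical sketch of the Erlang-mixture approximation of Theorem 5.5.1:
#     F(x) ~ sum_{j>=1} p_j(Delta) * P{E_{j,Delta} <= x},
# where p_j(Delta) = F(j*Delta) - F((j-1)*Delta) and E_{j,Delta} is Erlangian
# with j phases and scale parameter Delta.  The lognormal target F and the
# truncation level j_max are illustrative assumptions.

import math


def erlang_cdf(x: float, j: int, delta: float) -> float:
    """P{E_{j,delta} <= x} = 1 - sum_{k=0}^{j-1} e^{-x/delta} (x/delta)^k / k!."""
    lam = x / delta
    term = math.exp(-lam)      # k = 0 term of the Poisson sum
    total = term
    for k in range(1, j):      # accumulate the terms k = 1, ..., j-1
        term *= lam / k
        total += term
    return 1.0 - total


def erlang_mixture_cdf(x: float, F, delta: float, j_max: int) -> float:
    """Truncated version of the mixture in Theorem 5.5.1, evaluated at x."""
    approx = 0.0
    for j in range(1, j_max + 1):
        p_j = F(j * delta) - F((j - 1) * delta)
        approx += p_j * erlang_cdf(x, j, delta)
    return approx


if __name__ == "__main__":
    # Illustrative target: lognormal distribution function (mu = 0, sigma = 1).
    def F(t: float) -> float:
        return 0.5 * (1.0 + math.erf(math.log(t) / math.sqrt(2.0))) if t > 0 else 0.0

    x = 1.5
    for delta in (0.5, 0.1, 0.02):
        j_max = int(20.0 / delta)   # F(j_max * delta) is essentially 1 here
        print(f"Delta = {delta:4.2f}:  mixture = {erlang_mixture_cdf(x, F, delta, j_max):.5f},"
              f"  F(x) = {F(x):.5f}")
```

As $\Delta$ decreases, the printed mixture value should approach $F(x)$, in line with the limit established in the proof.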