(i) If X(t) and Y(t) are jointly WSS, then we have
R_Z(τ) = R_X(τ) + R_XY(τ) + R_YX(τ) + R_Y(τ)
where τ = s - t. Taking the Fourier transform of the above expression, we obtain
S_Z(ω) = S_X(ω) + S_XY(ω) + S_YX(ω) + S_Y(ω)
(ii) If X(t) and Y(t) are orthogonal [Eq. (6.21)],
R_XY(τ) = R_YX(τ) = 0
Then R_Z(τ) = R_X(τ) + R_Y(τ)    (6.134a)
and S_Z(ω) = S_X(ω) + S_Y(ω)    (6.134b)
(b) Setting τ = 0 in Eq. (6.134a), and using Eq. (6.15), we get
E[Z²(t)] = E[X²(t)] + E[Y²(t)]
which indicates that the mean square of Z(t) is equal to the sum of the mean squares of X(t) and Y(t).
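As a numerical illustration (an addition, not part of the original solution), the short Python sketch below estimates the spectra of two independent zero-mean sequences and of their sum with a crude averaged periodogram. The particular processes, the helper psd, and all parameters are illustrative assumptions; since independent zero-mean processes are orthogonal, the estimate of S_Z(ω) should match S_X(ω) + S_Y(ω) up to sampling error.

```python
import numpy as np

# Illustrative check of S_Z = S_X + S_Y for orthogonal processes
# (the MA filter, trial counts, and names below are arbitrary choices).
rng = np.random.default_rng(0)
n_trials, n = 2000, 256

def psd(sig):
    # Crude PSD estimate: periodogram averaged over independent trials.
    return np.mean(np.abs(np.fft.rfft(sig, axis=1)) ** 2, axis=0) / sig.shape[1]

X = rng.normal(size=(n_trials, n))                 # zero-mean white sequence
W = rng.normal(size=(n_trials, n + 2))
Y = (W[:, :-2] + W[:, 1:-1] + W[:, 2:]) / 3.0      # zero-mean MA sequence, independent of X
Z = X + Y                                          # X, Y independent and zero-mean, hence orthogonal

S_X, S_Y, S_Z = psd(X), psd(Y), psd(Z)
print(np.max(np.abs(S_Z - (S_X + S_Y))))           # small; shrinks as n_trials grows
```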
WHITE NOISE
6.20. Using the notion of generalized derivative, show that the generalized derivative X'(t) of the
Wiener process X(t) is a white noise.
From Eq. (5.64),
R_X(t, s) = σ² min(t, s)
and from Eq. (6.119) (Prob. 6.9), we have
∂R_X(t, s)/∂s = σ² ∂ min(t, s)/∂s = σ² u(t - s)    (6.135)
Now, using the δ function, the generalized derivative of a unit step function u(t) is given by
du(t)/dt = δ(t)
Applying the above relation to Eq. (6.135), we obtain
∂²R_X(t, s)/∂t ∂s = σ² ∂u(t - s)/∂t = σ² δ(t - s)
which is, by Eq. (6.116) (Prob. 6.7), the autocorrelation function of the generalized derivative X'(t) of the
Wiener process X(t); that is,
R_X'(t, s) = R_X'(τ) = σ² δ(τ)
where τ = t - s. Thus, by definition (6.43), we see that the generalized derivative X'(t) of the Wiener process
X(t) is a white noise.
Recall that the Wiener process is a normal process and its derivative is also normal (see Prob. 6.10).
Hence, the generalized derivative X'(t) of the Wiener process is called white normal (or white gaussian) noise.
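As a rough numerical check (an addition to the text, with σ, dt, and the helper acorr chosen purely for illustration), one can simulate independent Wiener increments over a step dt and examine the sample autocorrelation of ΔX/dt: it is approximately σ²/dt at lag 0 and near zero elsewhere, which is the discretized signature of R_X'(τ) = σ² δ(τ).

```python
import numpy as np

# Illustrative simulation: increments of a Wiener process with parameter sigma^2,
# divided by the step dt, behave like discretized white noise of strength sigma^2.
rng = np.random.default_rng(1)
sigma, dt, n = 2.0, 1e-3, 200_000

dX = rng.normal(scale=sigma * np.sqrt(dt), size=n)   # independent N(0, sigma^2 dt) increments
d = dX / dt                                          # discretized "derivative" X'(t)

def acorr(x, lag):
    # Sample autocorrelation at a nonnegative lag (helper for this sketch only).
    return np.mean(x[:len(x) - lag] * x[lag:]) if lag else np.mean(x * x)

print(acorr(d, 0) * dt / sigma**2)                       # about 1: lag-0 value is roughly sigma^2/dt
print([acorr(d, k) * dt / sigma**2 for k in (1, 2, 3)])  # near 0 away from the origin
```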
6.21. Let X(t) be a Poisson process with rate λ. Let
Y(t) = X(t) - λt
Show that the generalized derivative Y'(t) of Y(t) is a white noise.
Since Y(t) = X(t) - λt, we have formally
Y'(t) = X'(t) - λ
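The derivation continues from here; as a hedged numerical sketch of the claim (again an addition, with λ, dt, and the helper acorr as illustrative choices), simulating increments of Y(t) = X(t) - λt over a step dt shows that the sample autocorrelation of ΔY/dt is approximately λ/dt at lag 0 and near zero elsewhere, consistent with Y'(t) being a white noise.

```python
import numpy as np

# Illustrative simulation for Prob. 6.21: increments of Y(t) = X(t) - lam*t for a
# Poisson process X(t) of rate lam, divided by dt, behave like white noise of strength lam.
rng = np.random.default_rng(2)
lam, dt, n = 5.0, 1e-3, 200_000

dX = rng.poisson(lam * dt, size=n)      # independent Poisson(lam*dt) counts per step
dY = dX - lam * dt                      # zero-mean increments of Y(t)
d = dY / dt                             # discretized "derivative" Y'(t)

def acorr(x, lag):
    # Sample autocorrelation at a nonnegative lag (helper for this sketch only).
    return np.mean(x[:len(x) - lag] * x[lag:]) if lag else np.mean(x * x)

print(acorr(d, 0) * dt / lam)                       # about 1: lag-0 value is roughly lam/dt
print([acorr(d, k) * dt / lam for k in (1, 2, 3)])  # near 0 away from the origin
```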