Problems

Sections 9.1 and 9.2: Definition and Specification of a Stochastic Process

9.1. In Example 9.1, find the joint pmf for X_1 and X_2. Why are X_1 and X_2 independent?

9.2. A discrete-time random process X_n is defined as follows. A fair die is tossed and the outcome k is observed. The process is then given by X_n = k for all n.
(a) Sketch some sample paths of the process.
(b) Find the pmf for X_n.
(c) Find the joint pmf for X_n and X_{n+k}.
(d) Find the mean and autocovariance functions of X_n.

9.3. A discrete-time random process X_n is defined as follows. A fair coin is tossed. If the outcome is heads, X_n = (−1)^n for all n; if the outcome is tails, X_n = (−1)^{n+1} for all n.
(a) Sketch some sample paths of the process.
(b) Find the pmf for X_n.
(c) Find the joint pmf for X_n and X_{n+k}.
(d) Find the mean and autocovariance functions of X_n.

9.4. A discrete-time random process is defined by X_n = s^n for n ≥ 0, where s is selected at random from the interval (0, 1).
(a) Sketch some sample paths of the process.
(b) Find the cdf of X_n.
(c) Find the joint cdf for X_n and X_{n+1}.
(d) Find the mean and autocovariance functions of X_n.
(e) Repeat parts a, b, c, and d if s is uniform in (1, 2).

9.5. Let g(t) be the rectangular pulse shown in Fig. P9.1. The random process X(t) is defined as X(t) = A g(t), where A assumes the values ±1 with equal probability.

[FIGURE P9.1]

(a) Find the pmf of X(t).
(b) Find m_X(t).
(c) Find the joint pmf of X(t) and X(t + d).
(d) Find C_X(t, t + d), d > 0.

9.6. A random process is defined by Y(t) = g(t − T), where g(t) is the rectangular pulse of Fig. P9.1 and T is a uniformly distributed random variable in the interval (0, 1).
(a) Find the pmf of Y(t).
(b) Find m_Y(t) and C_Y(t_1, t_2).

9.7. A random process is defined by X(t) = g(t − T), where T is a uniform random variable in the interval (0, 1) and g(t) is the periodic triangular waveform shown in Fig. P9.2.

[FIGURE P9.2]

(a) Find the cdf of X(t) for 0 < t < 1.
(b) Find m_X(t) and C_X(t_1, t_2).

9.8. Let Y(t) = g(t − T) as in Problem 9.6, but let T be an exponentially distributed random variable with parameter α.
(a) Find the pmf of Y(t).
(b) Find the joint pmf of Y(t) and Y(t + d). Consider two cases: d > t and 0 < d < t.

[...]

(c) Show that P[S_{n_3} = j | S_{n_2} = i, S_{n_1} = k] = P[S_{n_3} = j | S_{n_2} = i], where n_3 > n_2 > n_1.

9.23. (a) Find P[S_n = 0] for the random walk process.
(b) What is the answer in part a if p = 1/2?

9.24. Consider the following moving average processes:
Y_n = (1/2)(X_n + X_{n−1}), X_0 = 0
Z_n = (2/3)X_n + (1/3)X_{n−1}, X_0 = 0
(a) Find the mean, variance, and covariance of Y_n and Z_n if X_n is a Bernoulli random process.
(b) Repeat part a if X_n is the random step process.
(c) Generate 100 outcomes of a Bernoulli random process X_n, and find the resulting Y_n and Z_n. Are the sample means of Y_n and Z_n in part a close to their respective means?
(d) Repeat part c with X_n given by the random step process.

9.25. Consider the following autoregressive processes:
W_n = 2W_{n−1} + X_n, W_0 = 0
Z_n = (3/4)Z_{n−1} + X_n, Z_0 = 0
(a) Suppose that X_n is a Bernoulli process. What trends do the processes exhibit?
(b) Express W_n and Z_n in terms of X_n, X_{n−1}, ..., X_1, and then find E[W_n] and E[Z_n]. Do these results agree with the trends you expect?
(c) Do W_n or Z_n have independent increments? stationary increments?
(d) Generate 100 outcomes of a Bernoulli process. Find the resulting realizations of W_n and Z_n. Is the sample mean meaningful for either of these processes?
(e) Repeat part d if X_n is the random step process.
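Problems 9.24(c)–(d) and 9.25(d)–(e) call for simulated realizations. The Python sketch below is one minimal way to generate them; it assumes the usual conventions that the Bernoulli process takes values in {0, 1} with p = 1/2 and the random step process takes values ±1 with equal probability, since those definitions are not reproduced in this excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Assumed conventions (the definitions are not reproduced in this excerpt):
# Bernoulli process X_n in {0, 1} with p = 1/2; random step process D_n in {-1, +1}.
X = rng.integers(0, 2, size=n)           # Bernoulli(1/2) outcomes for 9.24(c), 9.25(d)
D = 2 * rng.integers(0, 2, size=n) - 1   # random step outcomes for 9.24(d), 9.25(e)

def moving_averages(x):
    """Y_n = (x_n + x_{n-1})/2 and Z_n = (2/3)x_n + (1/3)x_{n-1}, taking x_0 = 0."""
    xprev = np.concatenate(([0], x[:-1]))
    return 0.5 * (x + xprev), (2 / 3) * x + (1 / 3) * xprev

def autoregressive(x):
    """W_n = 2 W_{n-1} + x_n and Z_n = (3/4) Z_{n-1} + x_n, with W_0 = Z_0 = 0."""
    w = np.zeros(len(x))
    z = np.zeros(len(x))
    prev_w = prev_z = 0.0
    for i, xi in enumerate(x):
        w[i] = prev_w = 2.0 * prev_w + xi
        z[i] = prev_z = 0.75 * prev_z + xi
    return w, z

Y, Z = moving_averages(X)
print("sample mean of Y_n:", Y.mean(), "  sample mean of Z_n:", Z.mean())

W, Zar = autoregressive(X)
print("W_100 (explodes roughly like 2^n):", W[-1])
print("tail average of Z_n (settles near E[X]/(1 - 3/4) = 2):", Zar[-10:].mean())
```

The same functions can be applied to the random step outcomes D to answer the "repeat" parts; only the input sequence changes.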
9.26. Let M_n be the discrete-time process defined as the sequence of sample means of an iid sequence:
M_n = (X_1 + X_2 + ... + X_n)/n.
(a) Find the mean, variance, and covariance of M_n.
(b) Does M_n have independent increments? stationary increments?

9.27. Find the pdf of the processes defined in Problem 9.24 if the X_n are an iid sequence of zero-mean, unit-variance Gaussian random variables.

9.28. Let X_n consist of an iid sequence of Cauchy random variables.
(a) Find the pdf of the sum process S_n. Hint: Use the characteristic function method.
(b) Find the joint pdf of S_n and S_{n+k}.

9.29. Let X_n consist of an iid sequence of Poisson random variables with mean α.
(a) Find the pmf of the sum process S_n.
(b) Find the joint pmf of S_n and S_{n+k}.

9.30. Let X_n be an iid sequence of zero-mean, unit-variance Gaussian random variables.
(a) Find the pdf of M_n defined in Problem 9.26.
(b) Find the joint pdf of M_n and M_{n+k}. Hint: Use the independent increments property of S_n.

9.31. Repeat Problem 9.26 with X_n = (1/2)(Y_n + Y_{n−1}), where Y_n is an iid random process. What happens to the variance of M_n as n increases?

9.32. Repeat Problem 9.26 with X_n = (3/4)X_{n−1} + Y_n, where Y_n is an iid random process. What happens to the variance of M_n as n increases?

9.33. Suppose that an experiment has three possible outcomes, say 0, 1, and 2, and suppose that these occur with probabilities p_0, p_1, and p_2, respectively. Consider a sequence of independent repetitions of the experiment, and let X_j(n) be the indicator function for outcome j. The vector X(n) = (X_0(n), X_1(n), X_2(n)) then constitutes a vector-valued Bernoulli random process. Consider the counting process for X(n):
S(n) = X(n) + X(n − 1) + ... + X(1).
(a) Show that S(n) has a multinomial distribution.
(b) Show that S(n) has independent increments, then find the joint pmf of S(n) and S(n + k).
(c) Show that the components S_j(n) of the vector process are binomial counting processes.

Section 9.4: Poisson and Associated Random Processes

9.34. A server handles queries that arrive according to a Poisson process with a rate of 10 queries per minute. What is the probability that no queries go unanswered if the server is unavailable for 20 seconds?

9.35. Customers deposit $1 in a vending machine according to a Poisson process with rate λ. The machine issues an item with probability p. Find the pmf for the number of items dispensed in time t.

9.36. Noise impulses occur in a radio transmission according to a Poisson process of rate λ.
(a) Find the probability that no impulses occur during the transmission of a message that is t seconds long.
(b) Suppose that the message is encoded so that the errors caused by up to 2 impulses can be corrected. What is the probability that a t-second message cannot be corrected?

9.37. Packets arrive at a multiplexer at two ports according to independent Poisson processes of rates λ_1 = 1 and λ_2 = 2 packets/second, respectively.
(a) Find the probability that a message arrives first on line 2.
(b) Find the pdf for the time until a message arrives on either line.
(c) Find the pmf for N(t), the total number of messages that arrive in an interval of length t.
(d) Generalize the result of part c for the "merging" of k independent Poisson processes of rates λ_1, ..., λ_k, respectively: N(t) = N_1(t) + ... + N_k(t).

9.38. (a) Find P[N(t − d) = j | N(t) = k] with d > 0, where N(t) is a Poisson process with rate λ.
(b) Compare your answer to P[N(t + d) = j | N(t) = k]. Explain the difference.
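Problem 9.34 reduces to the probability of zero arrivals in a 20-second (1/3-minute) window of a rate-10-per-minute Poisson process, P[N(1/3) = 0] = e^{−10/3} ≈ 0.036, and Problem 9.37(c) rests on the fact that merged independent Poisson streams form a Poisson stream of the summed rate. The Monte Carlo sketch below (the seed and trial count are ours, not the text's) checks both numerically:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 200_000

# Problem 9.34: rate 10 queries/min, server unavailable for 20 s = 1/3 min.
lam, t = 10.0, 1.0 / 3.0
counts = rng.poisson(lam * t, size=trials)
print("P[no arrivals in 20 s] ~", np.mean(counts == 0), "  exact e^{-10/3} =", np.exp(-lam * t))

# Problem 9.37(c): merge two independent Poisson streams of rates 1 and 2 packets/s.
n1 = rng.poisson(1.0 * t, size=trials)
n2 = rng.poisson(2.0 * t, size=trials)
merged = n1 + n2
print("mean of merged count ~", merged.mean(), "  expected (lam1 + lam2)*t =", 3.0 * t)
print("P[merged = 0] ~", np.mean(merged == 0), "  exact e^{-3t} =", np.exp(-3.0 * t))
```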
9.39. Let N_1(t) be a Poisson process with arrival rate λ_1 that is started at t = 0. Let N_2(t) be another Poisson process that is independent of N_1(t), that has arrival rate λ_2, and that is started at t = 1.
(a) Show that the pmf of the process N(t) = N_1(t) + N_2(t) is given by
P[N(t + τ) − N(t) = k] = ([m(t + τ) − m(t)]^k / k!) e^{−[m(t + τ) − m(t)]}, for k = 0, 1, ...,
where m(t) = E[N(t)].
(b) Now consider a Poisson process in which the arrival rate λ(t) is a piecewise constant function of time. Explain why the pmf of the process is given by the expression in part a, where m(t) = ∫_0^t λ(t′) dt′.
(c) For what other arrival rate functions λ(t) does the pmf in part a hold?

9.40. (a) Suppose that the time required to service a customer in a queueing system is a random variable T. If customers arrive at the system according to a Poisson process with parameter λ, find the pmf for the number of customers that arrive during one customer's service time. Hint: Condition on the service time.
(b) Evaluate the pmf in part a if T is an exponential random variable with parameter β.

9.41. (a) Is the difference of two independent Poisson random processes also a Poisson process?
(b) Let N_p(t) be the number of complete pairs generated by a Poisson process up to time t. Explain why N_p(t) is or is not a Poisson process.

9.42. Let N(t) be a Poisson random process with parameter λ. Suppose that each time an event occurs, a coin is flipped and the outcome (heads or tails) is recorded. Let N_1(t) and N_2(t) denote the number of heads and tails recorded up to time t, respectively. Assume that p is the probability of heads.
(a) Find P[N_1(t) = j, N_2(t) = k | N(t) = k + j].
(b) Use part a to show that N_1(t) and N_2(t) are independent Poisson random variables of rates pλt and (1 − p)λt, respectively:
P[N_1(t) = j, N_2(t) = k] = ((pλt)^j / j!) e^{−pλt} (((1 − p)λt)^k / k!) e^{−(1 − p)λt}.

9.43. Customers play a $1 game machine according to a Poisson process with rate λ. Suppose the machine dispenses a random reward X each time it is played. Let X(t) be the total reward issued up to time t.
(a) Find expressions for P[X(t) = j] if X is Bernoulli.
(b) Repeat part a if X assumes the values {0, 5} with probabilities (5/6, 1/6).
(c) Repeat part a if X is Poisson with mean 1.
(d) Repeat part a if with probability p the machine returns all the coins.

9.44. Let X(t) denote the random telegraph signal, and let Y(t) be a process derived from X(t) as follows: Each time X(t) changes polarity, Y(t) changes polarity with probability p.
(a) Find P[Y(t) = ±1].
(b) Find the autocovariance function of Y(t). Compare it to that of X(t).

9.45. Let Y(t) be the random signal obtained by switching between the values 0 and 1 according to the events in a Poisson process of rate λ. Compare the pmf and autocovariance of Y(t) with those of the random telegraph signal.

9.46. Let Z(t) be the random signal obtained by switching between the values 0 and 1 according to the events in a counting process N(t). Let
P[N(t) = k] = (1/(1 + λt)) (λt/(1 + λt))^k, k = 0, 1, 2, ....
(a) Find the pmf of Z(t).
(b) Find m_Z(t).

9.47. In the filtered Poisson process (Eq. (9.45)), let h(t) be a pulse of unit amplitude and duration T seconds.
(a) Show that X(t) is then the increment in the Poisson process in the interval (t − T, t).
(b) Find the mean and autocorrelation functions of X(t).

9.48. (a) Find the second moment and variance of the shot noise process discussed in Example 9.25.
(b) Find the variance of the shot noise process if h(t) = e^{−βt} for t ≥ 0.
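Problem 9.42 is the splitting (thinning) of a Poisson stream by independent coin flips. A minimal sketch of that construction follows, with illustrative values λ = 4, p = 0.3, t = 2 that do not come from the text; the head and tail counts should have means pλt and (1 − p)λt and be essentially uncorrelated, consistent with part (b).

```python
import numpy as np

rng = np.random.default_rng(2)
lam, p, t, trials = 4.0, 0.3, 2.0, 100_000   # illustrative values only

N = rng.poisson(lam * t, size=trials)   # total arrivals in (0, t]
N1 = rng.binomial(N, p)                 # arrivals whose coin flip came up heads
N2 = N - N1                             # arrivals whose coin flip came up tails

print("mean of N1 ~", N1.mean(), "  expected p*lam*t =", p * lam * t)
print("mean of N2 ~", N2.mean(), "  expected (1-p)*lam*t =", (1 - p) * lam * t)
# For a split Poisson stream the two counts are independent, so their covariance is near 0.
print("cov(N1, N2) ~", np.cov(N1, N2)[0, 1])
```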
9.49. Messages arrive at a message center according to a Poisson process of rate λ. Every hour the messages that have arrived during the previous hour are forwarded to their destination. Find the mean of the total time waited by all the messages that arrive during the hour. Hint: Condition on the number of arrivals and consider the arrival instants.

Section 9.5: Gaussian Random Process, Wiener Process and Brownian Motion

9.50. Let X(t) and Y(t) be jointly Gaussian random processes. Explain the relation between the conditions of independence, uncorrelatedness, and orthogonality of X(t) and Y(t).

9.51. Let X(t) be a zero-mean Gaussian random process with autocovariance function given by
C_X(t_1, t_2) = 4e^{−2|t_1 − t_2|}.
Find the joint pdf of X(t) and X(t + s).

9.52. Find the pdf of Z(t) in Problem 9.13 if X and Y are jointly Gaussian random variables.

9.53. Let Y(t) = X(t + d) − X(t), where X(t) is a Gaussian random process.
(a) Find the mean and autocovariance of Y(t).
(b) Find the pdf of Y(t).
(c) Find the joint pdf of Y(t) and Y(t + s).
(d) Show that Y(t) is a Gaussian random process.

9.54. Let X(t) = A cos ωt + B sin ωt, where A and B are iid Gaussian random variables with zero mean and variance σ².
(a) Find the mean and autocovariance of X(t).
(b) Find the joint pdf of X(t) and X(t + s).

9.55. Let X(t) and Y(t) be independent Gaussian random processes with zero means and the same covariance function C(t_1, t_2). Define the "amplitude-modulated signal"
Z(t) = X(t) cos ωt + Y(t) sin ωt.
(a) Find the mean and autocovariance of Z(t).
(b) Find the pdf of Z(t).

9.56. Let X(t) be a zero-mean Gaussian random process with autocovariance function given by C_X(t_1, t_2). If X(t) is the input to a "square law detector," then the output is Y(t) = X(t)². Find the mean and autocovariance of the output Y(t).

9.57. Let Y(t) = X(t) + μt, where X(t) is the Wiener process.
(a) Find the pdf of Y(t).
(b) Find the joint pdf of Y(t) and Y(t + s).

9.58. Let Y(t) = X²(t), where X(t) is the Wiener process.
(a) Find the pdf of Y(t).
(b) Find the conditional pdf of Y(t_2) given Y(t_1).

9.59. Let Z(t) = X(t) − αX(t − s), where X(t) is the Wiener process.
(a) Find the pdf of Z(t).
(b) Find m_Z(t) and C_Z(t_1, t_2).

9.60. (a) For X(t) the Wiener process with α = 1 and 0 [...]

[...]

9.94. Let X(t) = A cos(2πft), where A is a random variable with mean m and variance σ².
(a) Evaluate the time average ⟨X(t)⟩_T, find its limit as T → ∞, and compare to m_X(t).
(b) Evaluate ⟨X(t)X(t + τ)⟩_T, find its limit as T → ∞, and compare to R_X(t + τ, t).

9.95. Repeat Problem 9.94 with X(t) = A cos(2πft + Θ), where A is as in Problem 9.94, Θ is a random variable uniformly distributed in (0, 2π), and A and Θ are independent random variables.

9.96. Find an exact expression for VAR[⟨X(t)⟩_T] in Example 9.48. Find the limit as T → ∞.

9.97. The WSS random process X_n has mean m and autocovariance C_X(k) = (1/2)^{|k|}. Is X_n mean ergodic?

9.98. (a) Are the moving average processes Y_n in Problem 9.24 mean ergodic?
(b) Are the autoregressive processes Z_n in Problem 9.25a mean ergodic?

9.99. (a) Show that a WSS random process is mean ergodic if
∫ |C_X(u)| du < ∞.
(b) Show that a discrete-time WSS random process is mean ergodic if
Σ_k |C_X(k)| < ∞.

9.100. Let p̂ denote a time-average estimate for the mean power of a WSS random process.
(a) Under what conditions is this time average a valid estimate for E[X²(t)]?
(b) Apply your result in part a to the random phase sinusoid in Example 9.2.
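The autocovariance C_X(k) = (1/2)^{|k|} of Problem 9.97 is what a stationary first-order autoregression X_n = (1/2)X_{n−1} + W_n with unit variance produces, so a quick simulation can illustrate the mean ergodicity asked about in Problems 9.97–9.99. The sketch below is a rough illustration only: the AR(1) construction and the assumed mean of 1 are our choices, not the text's.

```python
import numpy as np

rng = np.random.default_rng(3)
a, n, runs = 0.5, 5000, 200
assumed_mean = 1.0   # illustrative mean; Problem 9.97 only specifies the autocovariance

# Stationary AR(1): X_n = a X_{n-1} + W_n with Var[W] = 1 - a^2 has C_X(k) = (1/2)^|k|,
# the autocovariance of Problem 9.97, so its time average should converge (mean ergodic).
time_avgs = np.empty(runs)
for r in range(runs):
    w = rng.normal(0.0, np.sqrt(1.0 - a**2), size=n)
    x = np.empty(n)
    x[0] = rng.normal()                 # start in the stationary (unit-variance) distribution
    for i in range(1, n):
        x[i] = a * x[i - 1] + w[i]
    time_avgs[r] = np.mean(x + assumed_mean)

print("average of the time averages:", time_avgs.mean())   # close to the assumed mean 1
print("spread across realizations:  ", time_avgs.std())    # shrinks as n grows: mean ergodic
```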
9.101. (a) Under what conditions is the time average ⟨X(t)X(t + τ)⟩_T a valid estimate for the autocorrelation R_X(τ) of a WSS random process X(t)?
(b) Apply your result in part a to the random phase sinusoid in Example 9.2.

9.102. Let Y(t) be the indicator function for the event {a < X(t) ≤ b}. Show that the time average ⟨Y(t)⟩_T is the proportion of time in the time interval (−T, T) that X(t) ∈ (a, b].
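Problems 9.100–9.102 concern time averages of a single realization. As a rough illustration, the sketch below assumes the random phase sinusoid of Example 9.2 has the form X(t) = A cos(2πft + Θ) with Θ uniform on (0, 2π) (an assumption, since the example is not reproduced here); the time-averaged lag product should approach (A²/2)cos(2πfτ), and the fraction of time spent in (a, b] plays the role of the indicator time average in Problem 9.102.

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed form of the Example 9.2 sinusoid: X(t) = A cos(2*pi*f*t + Theta), Theta ~ U(0, 2*pi).
# Its ensemble autocorrelation is R_X(tau) = (A**2 / 2) * cos(2*pi*f*tau).
A, f = 2.0, 1.0
T, dt = 200.0, 0.001
t = np.arange(0.0, T, dt)
theta = rng.uniform(0.0, 2.0 * np.pi)
x = A * np.cos(2.0 * np.pi * f * t + theta)      # a single realization

# Problem 9.101: time-average estimate of R_X(tau) from one realization.
tau = 0.1
lag = int(round(tau / dt))
r_hat = np.mean(x[:-lag] * x[lag:])
print("time-average R(0.1) ~", r_hat,
      "  ensemble value:", (A**2 / 2) * np.cos(2 * np.pi * f * tau))

# Problem 9.102: fraction of time X(t) spends in (a, b] estimates P[a < X(t) <= b].
a_lvl, b_lvl = 0.0, 1.0
print("fraction of time in (0, 1]:", np.mean((x > a_lvl) & (x <= b_lvl)))
```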
