CS 70 Discussion 14A

1        Chebyshev’s Inequality vs. Central Limit Theorem
Let n be a positive integer. Let X1,X2,...,Xn be i.i.d. random variables with the following distribution:

                                      P[Xi = −1] = 1/4;          P[Xi = 0] = 1/2;          P[Xi = 1] = 1/4.

(a)    Calculate the expectations and variances of X1, X1 + X2 + ... + Xn, and

                                          Zn = ((X1 − E[X1]) + ... + (Xn − E[Xn])) / √(n/2).

(b)   Use Chebyshev’s Inequality to find an upper bound b for P[|Zn| ≥ 2].

(c)    Can you use b to bound P[Zn ≥ 2] and P[Zn ≤ −2]?

(d)   As n → ∞, what is the distribution of Zn?

(e)    We know that if Z ∼ N(0,1), then P[|Z| ≤ 2] = Φ(2) − Φ(−2) ≈ 0.9545. As n → ∞, can you provide approximations for P[Zn ≥ 2] and P[Zn ≤ −2]?
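
As a sanity check (not part of the original worksheet), here is a minimal Monte Carlo sketch in Python; it assumes the distribution for Xi reconstructed above, namely P[Xi = −1] = P[Xi = 1] = 1/4 and P[Xi = 0] = 1/2.

```python
import numpy as np

# Sketch, not from the worksheet. Assumes the reconstructed distribution
# P[X = -1] = 1/4, P[X = 0] = 1/2, P[X = 1] = 1/4, so E[X] = 0 and Var(X) = 1/2.
rng = np.random.default_rng(0)

def sample_Z(n, trials):
    # Z_n = (sum of the centered X_i) / sqrt(n/2); here E[X_i] = 0, so no centering term.
    X = rng.choice([-1, 0, 1], p=[0.25, 0.5, 0.25], size=(trials, n))
    return X.sum(axis=1) / np.sqrt(n / 2)

for n in [10, 100, 1000]:
    Z = sample_Z(n, trials=100_000)
    print(f"n = {n:4d}:  P[|Z_n| >= 2] ~ {np.mean(np.abs(Z) >= 2):.4f}")
```

Chebyshev guarantees that the tail never exceeds the bound from part (b), while the CLT predicts that it approaches 1 − (Φ(2) − Φ(−2)) ≈ 0.0455 as n grows.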

2        Markov Chain Basics
A Markov chain is a sequence of random variables Xn, n = 0,1,2,.... Here is one interpretation of a Markov chain: Xn is the state of a particle at time n. At each time step, the particle can jump to another state. Formally, a Markov chain satisfies the Markov property:

                                          P(Xn+1 = j | Xn = i,Xn−1 = in−1,...,X0 = i0) = P(Xn+1 = j | Xn = i),               (1)

for all n, and for all sequences of states i0,...,in−1,i, j. In other words, the Markov chain does not have any memory; the transition probability only depends on the current state, and not the history of states that have been visited in the past.
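
For concreteness, here is a small made-up example (not from the worksheet): a chain on two states {1,2} with

                                          P(Xn+1 = 2 | Xn = 1) = 0.1,          P(Xn+1 = 1 | Xn = 2) = 0.5.

If the particle is currently in state 1, it stays with probability 0.9 and jumps to state 2 with probability 0.1, no matter which states it visited at times 0,...,n − 1; that insensitivity to the history is exactly property (1).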

(a)    In lecture, we learned that we can specify Markov chains by providing three ingredients: X, P, and π0. What do these represent, and what properties must they satisfy?

(b)   If we specify X, P, and π0, we are implicitly defining a sequence of random variables Xn, n = 0,1,2,..., that satisfies (1). Explain why this is true.

(c)    Calculate P(X1 = j) in terms of π0 and P. Then, express your answer in matrix notation. What is the formula for P(Xn = j) in matrix form?
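
The matrix form can be checked numerically. Here is a minimal sketch (my own illustration; the two-state chain and π0 are made up, not from the worksheet) that treats π0 as a row vector, so the distribution at time n is the row vector π0 P^n:

```python
import numpy as np

# Illustrative two-state chain; P and pi0 are made up for this sketch.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])     # P[i, j] = P(X_{n+1} = j | X_n = i); each row sums to 1
pi0 = np.array([1.0, 0.0])     # initial distribution pi_0: start in state 0

# P(X_1 = j) = sum_i pi0[i] * P[i, j], i.e. the j-th entry of the row vector pi0 @ P.
pi1 = pi0 @ P

# More generally, P(X_n = j) is the j-th entry of pi0 @ P^n.
pi10 = pi0 @ np.linalg.matrix_power(P, 10)
print(pi1, pi10)
```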

3        Playing Blackjack
You are playing a game of Blackjack where you start with $100. You are a particularly risk-loving player who does not believe in leaving the table until you either make $400 or lose all your money.

At each turn, you either win $100 with probability p or lose $100 with probability 1 − p.

(a)    Formulate this problem as a Markov chain, i.e., define your state space and transition probabilities, and determine your starting state.

(b)   Find the probability that you end the game with $400.
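
This is the classical gambler's-ruin setup. As a check on part (b), here is a short sketch (not part of the worksheet) that solves the first-step equations numerically, with states measured in units of $100 and absorbing states at $0 and $400:

```python
import numpy as np

# Gambler's-ruin sketch: states i = 0..4 represent $0, $100, ..., $400.
# q[i] = P(reach $400 before $0 | current state i), with q[0] = 0 and q[4] = 1;
# first-step analysis gives q[i] = p*q[i+1] + (1-p)*q[i-1] for i = 1, 2, 3.
def win_probability(p):
    A = np.zeros((5, 5))
    b = np.zeros(5)
    A[0, 0] = 1.0                 # boundary: q[0] = 0 (ruin)
    A[4, 4] = 1.0
    b[4] = 1.0                    # boundary: q[4] = 1 (reached $400)
    for i in range(1, 4):         # interior states
        A[i, i] = 1.0
        A[i, i + 1] = -p
        A[i, i - 1] = -(1 - p)
    return np.linalg.solve(A, b)[1]   # start at $100, i.e. state 1

print(win_probability(0.5))   # fair game: ~0.25 (stake over target)
print(win_probability(0.6))
```

For p = 1/2 the win probability is the starting stake over the target, 100/400 = 1/4, which the solver reproduces.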
