TD1MDBS
Master 1 2020/2021
TUTORIAL EXERCISES
SHEET 1
MARKOV CHAINS AND QUEUES
Exercise 1 (Transforming a Process into a Markov chain) Suppose that whether or not it rains
today depends on previous weather conditions through the last two days. That is, suppose that
if it has rained for the past two days, then it will rain tomorrow with probability .7; if it rained
today but not yesterday, it will rain tomorrow with probability .5; if it rained yesterday but not
today, then it will rain tomorrow with probability .4; if it has not rained the past two days, then
it will rain tomorrow with probability .2.
1) Explain why, if we let the state at time n depend only on whether or not it is raining at
time n, the above model is not a Markov chain.
2) Now we set
state 0: it rained both today and yesterday;
state 1: it rained today but not yesterday;
state 2: it rained yesterday but not today;
state 3: it did not rain either yesterday or today.
Prove that by using these states the model becomes a Markov chain with state space
S = {0, 1, 2, 3} and transition matrix
P = \begin{pmatrix} .7 & 0 & .3 & 0 \\ .5 & 0 & .5 & 0 \\ 0 & .4 & 0 & .6 \\ 0 & .2 & 0 & .8 \end{pmatrix}
3) Draw the transition graph and classify the states.
4) Given that it rained on Monday and Tuesday, what is the probability that it will rain on
Thursday?
5) We denote by α = (α0, α1, α2, α3) the initial distribution of the process, that is,
αj = P(X0 = j).
Suppose that α0 = .4, α1 = .6. Find the probability that it will rain four days after we
begin keeping weather records. Recall that if µ^(n) denotes the probability distribution of
the random variable Xn, i.e. µ_j^(n) = P(Xn = j), then µ^(n) = αP^n. So you must
calculate P(X4 = 0) (explain why). (A numerical check of parts 4) and 5) is sketched
after this exercise.)
Exercise 3 (Gambler's ruin) A gambler plays a game in which on each play he wins one dollar
with probability 1/2 and loses one dollar with probability 1/2. He plays until he reaches a fortune
of 4 dollars or is ruined.
1) Model this random phenomenon by a Markov chain.
2) Give the probability of ruin of the gambler if he starts with a fortune of 2 dollars, and also
the probability that he reaches his goal of 4 dollars (a numerical check of parts 2) and 3) is
sketched after this exercise).
3) Calculate the average duration of this game.
4) Without any calculation, compare the probability of ruin of the gambler starting from an
initial fortune of 1 dollar with the probability of reaching his goal starting from an initial
fortune of 3 dollars.
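The following Python sketch (not part of the original exercise) checks parts 2) and 3) by first-step analysis, solving the linear systems satisfied by the ruin probabilities and the expected duration; numpy is assumed.

```python
import numpy as np

N = 4
# One-step transition matrix on {0, 1, 2, 3, 4}: 0 and N are absorbing,
# from i in {1, ..., N-1} the chain moves to i-1 or i+1 with probability 1/2.
P = np.zeros((N + 1, N + 1))
P[0, 0] = P[N, N] = 1.0
for i in range(1, N):
    P[i, i - 1] = P[i, i + 1] = 0.5

# Ruin probabilities r(i) = P(hit 0 before N | start at i) satisfy
#   r(0) = 1, r(N) = 0, r(i) = 0.5 r(i-1) + 0.5 r(i+1) for 0 < i < N.
A = np.eye(N - 1) - P[1:N, 1:N]
r = np.linalg.solve(A, P[1:N, 0])           # r = (r(1), r(2), r(3))
print("ruin probability from 2:", r[1])      # expected: 1/2
print("probability of reaching 4 from 2:", 1 - r[1])

# Expected durations d(i) satisfy d(0) = d(N) = 0, d(i) = 1 + 0.5 d(i-1) + 0.5 d(i+1).
d = np.linalg.solve(A, np.ones(N - 1))       # d = (d(1), d(2), d(3))
print("expected duration from 2:", d[1])     # expected: 4
```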
Exercise 4
Consider a Markov chain (Xn, n ∈ N) with values in E = {1, 2, 3, 4} and transition matrix
Q = \begin{pmatrix} 1 & 0 & 0 & 0 \\ p/2 & 1-p-q & q & p/2 \\ 0 & 0 & 1 & 0 \\ q & 0 & p & 1-p-q \end{pmatrix}, \qquad (p, q > 0,\ p + q < 1)
and initial state 2.
(1) Draw the transition graph of the chain. Deduce the different classes and their nature.
(2) Let T be the first entry time into C = {1, 3}.
(a) Express T in terms of the sequence (Xn)n≥0. Deduce the values taken by T.
(b) Determine the expectation of T. For which values of p and q is it maximal? (A simulation sketch follows the exercise.)
(3) Show that XT is a random variable and that Xt = XT for t ≥ T.
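For part (2)(b), a Monte Carlo estimate can serve as a sanity check. The Python sketch below (not part of the original exercise) simulates the chain from state 2 and estimates E[T], the first entry time into C = {1, 3}; the values p = 0.3 and q = 0.2 are illustrative choices of mine, and numpy is assumed.

```python
import numpy as np

p, q = 0.3, 0.2                      # illustrative values; any p, q > 0 with p + q < 1
states = [1, 2, 3, 4]
Q = np.array([
    [1.0,   0.0,       0.0, 0.0      ],
    [p / 2, 1 - p - q, q,   p / 2    ],
    [0.0,   0.0,       1.0, 0.0      ],
    [q,     0.0,       p,   1 - p - q],
])

rng = np.random.default_rng(0)

def entry_time(start=2, absorbing={1, 3}):
    """Run the chain from `start` until it enters `absorbing`; return the hitting time."""
    x, t = start, 0
    while x not in absorbing:
        x = rng.choice(states, p=Q[x - 1])   # rows of Q are indexed from 0
        t += 1
    return t

samples = [entry_time() for _ in range(20000)]
print("estimated E[T] starting from state 2:", np.mean(samples))
```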
Exercise 5 The fortune of a casino gambler is described by a random walk {Sn : n ≥ 0}.
GOOD LUCK