A player plays a game in which, during each round, he has probability 0.55 of winning $1 and probability 0.45 of losing $1. These probabilities do not change from round to round, and the outcomes of rounds are independent. The game stops when either the player loses all his money or his wealth reaches a fortune of $M. Assume M = 5, and the player starts the game with $1.
(a) Model the player's wealth as a Markov chain and construct the probability transition matrix.
(b) What is the probability that the player goes broke after 3 rounds of play?
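As a sanity check, here is a minimal sketch (in Python with NumPy) of one way to set up the transition matrix for M = 5 and compute the quantity in (b). It assumes "goes broke after 3 rounds" means the player is in the absorbing broke state $0 at the end of round 3, i.e., the 3-step transition probability from state 1 to state 0; all variable names are illustrative.

```python
import numpy as np

# States 0..M track the player's wealth; 0 (broke) and M (fortune) are absorbing.
M = 5
p_win, p_lose = 0.55, 0.45

# (a) Build the (M+1) x (M+1) transition matrix row by row.
P = np.zeros((M + 1, M + 1))
P[0, 0] = 1.0              # broke: absorbing state
P[M, M] = 1.0              # reached $M: absorbing state
for i in range(1, M):
    P[i, i + 1] = p_win    # win $1 this round
    P[i, i - 1] = p_lose   # lose $1 this round

# (b) Probability of being broke (state 0) after 3 rounds, starting from $1:
P3 = np.linalg.matrix_power(P, 3)
print(P3[1, 0])            # 0.45 + 0.55 * 0.45 * 0.45 = 0.561375
```

Under this reading, only two sample paths contribute: losing immediately (probability 0.45, after which the chain stays absorbed at 0) and win-lose-lose (0.55 × 0.45 × 0.45), which the matrix power aggregates automatically.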