Answer:

In the long run, the probability distribution vector X_m approaches a fixed probability distribution vector X, called the steady-state (or limiting) distribution vector, which satisfies PX = X. (The specific matrix and vector values from the problem are not shown here.)

Step-By-Step Explanation:

Let P be the transition matrix for a Markov chain with two states, and let X_0 be the initial state vector for the population. After m steps the distribution is X_m = P^m X_0. In the long run, X_m approaches a probability distribution vector X that no longer changes under P, that is, PX = X, with nonnegative entries summing to 1. This vector is called the steady-state (or limiting) distribution vector.
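For a concrete feel, here is a minimal Python sketch of the idea. The transition matrix P and initial vector X_0 below are hypothetical stand-ins (the actual values from the problem are not shown above); the loop iterates X_{m+1} = P X_m until it settles near the steady-state vector, and the result is cross-checked against the eigenvector of P for eigenvalue 1.

```python
import numpy as np

# Hypothetical two-state transition matrix P (columns sum to 1, matching PX = X).
# These numbers are illustrative only; the matrix from the original problem is not shown here.
P = np.array([[0.7, 0.4],
              [0.3, 0.6]])

# Hypothetical initial state vector X_0 (entries sum to 1).
X0 = np.array([0.5, 0.5])

# Iterate X_{m+1} = P X_m; for a regular Markov chain this converges
# to the steady-state vector X with PX = X.
X = X0.copy()
for _ in range(100):
    X = P @ X

print("Approximate steady-state vector after 100 steps:", X)

# Cross-check: the steady-state vector is the eigenvector of P for eigenvalue 1,
# rescaled so its entries sum to 1.
eigvals, eigvecs = np.linalg.eig(P)
v = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
print("Eigenvector check:", v / v.sum())
```

With these illustrative values both prints give approximately [0.571, 0.429], and the answer does not depend on the choice of X_0.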
Answer:
The answer would be 50.
Step-by-step explanation: