The matrix \(P = (p_{ij})\) is called the transition matrix of the Markov chain.
\(p_{ij} = P(X_{n+1} = j \mid X_n = i)\)
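As a sketch of how the transition matrix is used in practice, the following example simulates a sample path of a Markov chain. The two-state matrix `P` and the helper names `step` and `simulate` are illustrative assumptions, not from the book; the only requirement is that each row of the matrix sums to 1.

```python
import random

# Hypothetical two-state chain (states 0 and 1), purely for illustration.
# P[i][j] = p_ij = P(X_{n+1} = j | X_n = i); each row must sum to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, P, rng=random):
    """Draw the next state j with probability p_ij, given current state i."""
    u = rng.random()
    cumulative = 0.0
    for j, p in enumerate(P[state]):
        cumulative += p
        if u < cumulative:
            return j
    return len(P) - 1  # guard against floating-point round-off

def simulate(x0, P, n, seed=0):
    """Return a sample path X_0, X_1, ..., X_n started from state x0."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1], P, rng))
    return path

print(simulate(0, P, 10))  # a length-11 path of 0s and 1s
```

Because the next state is drawn using only the current state's row of \(P\), the simulation respects the Markov property by construction.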
In conclusion, Markov chains are a fundamental concept in probability theory with numerous applications across many fields. The book "Markov Chains" by J.R. Norris is a comprehensive resource for anyone looking to learn the subject: it covers the basic theory of Markov chains as well as more advanced topics, and is aimed at graduate students and researchers.