Unlocking the Secrets of Markov Chains: Changing the Game Forever
A Markov chain is a sequence of random steps in which the next state depends only on the current state, not on the full history. If the same transition rule is applied at every step, the chain is time-homogeneous, and a distribution over states that the transition rule leaves unchanged is called a stationary distribution. The harder case is a time-inhomogeneous chain, whose transition rule changes from step to step. For a chain with finitely many states, the transitions at each step can be written as a transition matrix, one matrix per time step. The long-run behavior of these changing chains can still be analyzed using measures of how quickly the distribution over states settles down, such as ergodicity coefficients. Notably, some time-inhomogeneous chains still share a single stationary distribution that every step's transition matrix leaves unchanged.
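To make this concrete, here is a minimal sketch in Python of a two-state, time-inhomogeneous chain whose transition matrix alternates between two choices, yet both matrices leave the same distribution unchanged. The specific matrices `P_even`, `P_odd` and the distribution `pi` are illustrative assumptions, not taken from any particular source.

```python
import numpy as np

# Two transition matrices used at alternating time steps
# (a simple time-inhomogeneous chain).
P_even = np.array([[0.7, 0.3],
                   [0.3, 0.7]])   # rule applied at even steps
P_odd  = np.array([[0.9, 0.1],
                   [0.1, 0.9]])   # rule applied at odd steps

pi = np.array([0.5, 0.5])         # candidate common stationary distribution

# pi is stationary for a matrix P when pi @ P equals pi.
for name, P in [("P_even", P_even), ("P_odd", P_odd)]:
    print(name, "keeps pi fixed:", np.allclose(pi @ P, pi))

# Push an arbitrary starting distribution through the alternating matrices
# and watch it settle toward pi.
dist = np.array([1.0, 0.0])       # start entirely in state 0
for t in range(20):
    P = P_even if t % 2 == 0 else P_odd
    dist = dist @ P
print("distribution after 20 steps:", dist)
```

Running this prints `True` for both matrices and a final distribution very close to `[0.5, 0.5]`, showing that even though the rule changes every step, the chain still settles toward the shared stationary distribution.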