Markov chains are a relatively simple but very interesting and useful class of random
processes. A Markov chain describes a system whose state changes over time. The changes
are not completely predictable, but rather are governed by probability distributions. These
probability distributions incorporate a simple sort of dependence structure, where the conditional distribution of future states of the system, given some information about past
states, depends only on the most recent piece of information. That is, what matters in
predicting the future of the system is its present state, and not the path by which the
system got to its present state. Markov chains illustrate many of the important ideas of
stochastic processes in an elementary setting. This classical subject is still very much alive,
with important developments in both theory and applications coming at an accelerating
pace in recent decades.
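The Markov property described above can be made concrete with a small simulation. The sketch below uses a hypothetical two-state weather chain (the states and transition probabilities are invented for illustration, not taken from the text): each day's weather is sampled using only the current day's state, so the path by which the chain arrived at today's weather plays no role.

```python
import random

# Illustrative transition probabilities (assumed values, for demonstration only):
# P[current][next] is the probability of moving from `current` to `next`.
P = {
    "rainy": {"rainy": 0.6, "sunny": 0.4},
    "sunny": {"rainy": 0.2, "sunny": 0.8},
}

def step(state, rng):
    """Sample tomorrow's state using only today's state (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps days starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 7))
```

Note that `step` receives only the current state, never the history: that restriction is exactly the dependence structure the paragraph describes.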
[Figure: Tomorrow's Weather Forecast (Central Park, NY 10024)]