The study of Markov chains, particularly their limiting behavior, depends critically on the random times between visits to a given state. The nature of these random times leads to a fundamental dichotomy of the states.

As usual, our starting point is a (time homogeneous) Markov chain $X=(X_{0}, X_{1}, X_{2}, \ldots)$ with state space $S$ and transition probability matrix $P$.

Let $A$ be a nonempty subset of $S$. Recall that the hitting time to $A$ is the random variable that gives the first positive time that the chain is in $A$:

$$T_{A}=\min\{n\in \mathbb{N}_{+}\colon X_{n}\in A\}$$

Since the chain may *never* enter $A$, the random variable $T_{A}$ takes values in $\mathbb{N}_{+}\cup \{\infty \}$. When $A=\{y\}$, we will simplify the notation to $T_{y}$. This random variable gives the first positive time that the chain is in state $y$. Now for $x\in S$ and $n\in \mathbb{N}_{+}$, let

$$H_{n}(x, A)=\mathbb{P}(T_{A}=n\mid X_{0}=x)\text{,\hspace{1em}}H(x, A)=\mathbb{P}(T_{A}< \infty \mid X_{0}=x)$$

Note that $H(x, A)=\sum_{n=1}^{\infty} H_{n}(x, A)$

Again, when $A=\{y\}$, we will simplify the notation to $H_{n}(x, y)$ and $H(x, y)$, respectively. In particular, $H(x, x)$ is the probability, starting at $x$, that the chain eventually returns to $x$. If $x\neq y$, $H(x, y)$ is the probability, starting at $x$, that the chain eventually reaches $y$. Just knowing whether $H(x, y)$ is 0, positive, or 1 will turn out to be of considerable importance in the overall structure and limiting behavior of the chain. As a function on $S^{2}$, we will refer to $H$ as the hitting matrix of $X$.

Show that $H(x, y)> 0$ if and only if $P^{n}(x, y)> 0$ for some $n\in \mathbb{N}_{+}$.

- Show that $\{X_{n}=y\}\subseteq \{T_{y}< \infty \}$ for all $n\in \mathbb{N}_{+}$.
- Show that $\{T_{y}< \infty \}=\{\exists k\in \mathbb{N}_{+}\colon X_{k}=y\}$.
- Use basic rules of probability and Boole's inequality to conclude that $P^{n}(x, y)\le H(x, y)\le \sum_{k=1}^{\infty} P^{k}(x, y)$ for all $n\in \mathbb{N}_{+}$.

The following exercise gives a basic relationship between the sequence of hitting probabilities and the sequence of transition probabilities.

Suppose that $(x, y)\in S^{2}$. Condition on $T_{y}$ to show that

$$P^{n}(x, y)=\sum_{k=1}^{n} H_{k}(x, y)P^{n-k}(y, y)\text{,\hspace{1em}}n\in \mathbb{N}_{+}$$

Starting in state $x$, the chain is in state $y$ at time $n$ if and only if the chain hits $y$ for the first time at some previous time $k$, and then returns to $y$ in the remaining $n-k$ steps.
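This renewal-type identity is easy to check numerically. The sketch below is ours (a small three-state matrix chosen arbitrarily): it computes the first-passage probabilities $H_{k}(x, y)$ by conditioning on the first step, then verifies the identity term by term.

```python
import numpy as np

# A small three-state chain of our own choosing, just for illustration.
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.3, 0.3, 0.4]])
y = 2          # target state
N = 10         # check the identity up to this time

# First-passage probabilities H_k(x, y), computed by conditioning on the
# first step: H_1(x, y) = P(x, y), and
# H_{k+1}(x, y) = sum over z != y of P(x, z) H_k(z, y).
H = np.zeros((N + 1, 3))       # H[k, x] holds H_k(x, y)
H[1] = P[:, y]
for k in range(1, N):
    for x in range(3):
        H[k + 1, x] = sum(P[x, z] * H[k, z] for z in range(3) if z != y)

# Verify P^n(x, y) = sum_{k=1}^n H_k(x, y) P^{n-k}(y, y) for n = 1..N.
for n in range(1, N + 1):
    Pn = np.linalg.matrix_power(P, n)
    for x in range(3):
        rhs = sum(H[k, x] * np.linalg.matrix_power(P, n - k)[y, y]
                  for k in range(1, n + 1))
        assert abs(Pn[x, y] - rhs) < 1e-9
print("identity verified up to n =", N)
```

Note that the $k=n$ term uses $P^{0}(y, y)=1$, corresponding to a first hit exactly at time $n$.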

Suppose that $x\in S$ and $A\subseteq S$. Condition on $X_{1}$ to show that

- $H_{n+1}(x, A)=\sum_{z\notin A} P(x, z)H_{n}(z, A)$ for $n\in \mathbb{N}_{+}$
- $H(x, A)=P(x, A)+\sum_{z\notin A} P(x, z)H(z, A)$

Starting in state $x$, the chain first enters $A$ at time $n+1$ if and only if the chain goes to some state $z\notin A$ at time 1, and then from state $z$, first enters $A$ in $n$ steps. Similarly, starting in state $x$, the chain eventually enters $A$ if and only if it either enters $A$ at the first step, or moves to some other state $z\notin A$ at the first step, and then eventually enters $A$ from $z$.
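These recursions give a practical way to compute the hitting probabilities: iterate the first-passage recursion and sum. A minimal sketch, under an assumed example of our own (fair gambler's ruin on $\{0,1,2,3\}$, with 0 and 3 absorbing):

```python
import numpy as np

# Fair gambler's ruin on {0,1,2,3} (our example): 0 and 3 are absorbing.
P = np.array([[1, 0, 0, 0],
              [0.5, 0, 0.5, 0],
              [0, 0.5, 0, 0.5],
              [0, 0, 0, 1.0]])
A = {3}
states = range(4)

# H_n(x, A) by the recursion H_{n+1}(x, A) = sum_{z not in A} P(x, z) H_n(z, A),
# starting from H_1(x, A) = P(x, A); H(x, A) is the sum of the H_n(x, A).
n_max = 200
Hn = np.array([sum(P[x, a] for a in A) for x in states])  # H_1
H = Hn.copy()                                             # running sum
for n in range(1, n_max):
    Hn = np.array([sum(P[x, z] * Hn[z] for z in states if z not in A)
                   for x in states])
    H += Hn

# Starting at 1 or 2, the chain hits 3 with probability 1/3 or 2/3;
# from the absorbing state 0 it never does.
assert abs(H[1] - 1/3) < 1e-9 and abs(H[2] - 2/3) < 1e-9
assert abs(H[0]) < 1e-12
print(H.round(6))
```

The truncation at `n_max` is harmless here because the remaining first-passage probabilities decay geometrically.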

A state $x$ is said to be recurrent if $H(x, x)=1$ and is said to be transient if $H(x, x)< 1$. Thus, starting in a recurrent state, the chain will, with probability 1, eventually return to the state. As we will see, the chain will return to the state infinitely often with probability 1, and the times of the visits will form the arrival times of a renewal process. This will turn out to be the critical observation in the study of the limiting behavior of the chain. By contrast, if the chain starts in a transient state, then there is a positive probability that the chain will never return to the state.

Again, suppose that $A$ is a nonempty set of states. A natural complement to the hitting time to $A$ is the counting variable that gives the number of visits to $A$ (at positive times). Thus, let

$$N_{A}=\sum_{n=1}^{\infty} \mathbf{1}(X_{n}\in A)$$

We will mostly be interested in the special case $A=\{x\}$, and in this case, we will simplify the notation to $N_{x}$. Note that $N_{A}$ takes values in $\mathbb{N}\cup \{\infty \}$.

Let $G(x, A)=\mathbb{E}(N_{A}\mid X_{0}=x)$ for $x\in S$ and $A\subseteq S$. Show that

$$G(x, A)=\sum_{n=1}^{\infty} P^{n}(x, A)$$

Consistent with our notation in the Introduction, $G$ is a transition measure on $S$ (although, of course, not a transition *probability* measure).

The matrix $G$ is known as the potential matrix. (Some authors count the state at time 0 and thus the index $n$ in the definition of the counting variable and in Exercise 5 starts at 0 rather than 1.)

The distribution of $N_{A}$ has a simple representation in terms of the hitting probabilities. Note that because of the Markov property and the time homogeneous property, whenever the chain reaches state $y$, the future behavior is independent of the past and is stochastically the same as the chain starting in state $y$ at time 0. This is the critical observation in the proof of the following exercise.

Suppose that $x$ and $y$ are states. Show that

$$\mathbb{P}(N_{y}=n\mid X_{0}=x)=\begin{cases}1-H(x, y) & \text{if } n=0\\ H(x, y)H(y, y)^{n-1}(1-H(y, y)) & \text{if } n\in \mathbb{N}_{+}\end{cases}$$

The essence of the proof is illustrated in the graphic above. The thick lines are intended as reminders that these are not one-step transitions, but rather represent all paths between the given vertices. Note that in the special case that $x=y$ we have

$$\mathbb{P}(N_{y}=n\mid X_{0}=y)=H(y, y)^{n}(1-H(y, y))\text{,\hspace{1em}}n\in \mathbb{N}$$

In all cases, the counting variable $N_{y}$ has essentially a geometric type distribution, but the distribution may well be defective, with some of the probability mass at $\infty $. The behavior is quite different depending on whether $y$ is transient or recurrent.

Suppose that $x$ and $y$ are states and that $y$ is transient. Show that

- $\mathbb{P}(N_{y}< \infty \mid X_{0}=x)=1$
- $G(x, y)=\frac{H(x, y)}{1-H(y, y)}$
- $H(x, y)=\frac{G(x, y)}{1+G(y, y)}$
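These two formulas are inverse to each other, and both can be checked numerically. A sketch with a two-state example of our own: from state 0 the chain stays with probability 1/2, otherwise it moves to the absorbing state 1, so state 0 is transient with $H(0, 0)=1/2$.

```python
import numpy as np

# Two-state example (ours): 0 is transient, 1 is absorbing.
P = np.array([[0.5, 0.5],
              [0.0, 1.0]])

# Potential matrix entry G(0, 0) = sum_{n>=1} P^n(0, 0); here the terms
# are (1/2)^n, so a truncated sum is essentially exact.
G00 = sum(np.linalg.matrix_power(P, n)[0, 0] for n in range(1, 100))

# Return probability from the inverse relation H(y, y) = G(y, y)/(1 + G(y, y)).
H00 = G00 / (1 + G00)
assert abs(G00 - 1.0) < 1e-9       # G(0,0) = 1/2 + 1/4 + ... = 1
assert abs(H00 - 0.5) < 1e-9       # matches H(0,0) = P(0,0) = 1/2 by hand
# And the forward relation G(x, y) = H(x, y) / (1 - H(y, y)):
assert abs(H00 / (1 - H00) - G00) < 1e-9
print(G00, H00)
```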

Suppose that $x$ and $y$ are states and that $y$ is recurrent. Show that

- $\mathbb{P}(N_{y}=0\mid X_{0}=x)=1-H(x, y)$ and $\mathbb{P}(N_{y}=\infty \mid X_{0}=x)=H(x, y)$
- $G(x, y)=0$ if $H(x, y)=0$ and $G(x, y)=\infty $ if $H(x, y)> 0$
- $\mathbb{P}(N_{y}=\infty \mid X_{0}=y)=1$ and $G(y, y)=\infty $

Note that there is an invertible relationship between the hitting probability matrix $H$ and the potential matrix $G$; if we know one we can compute the other. In particular, we can characterize the transience or recurrence of a state in terms of $G$:

- State $x$ is transient if and only if $H(x, x)< 1$ if and only if $G(x, x)< \infty $.
- State $x$ is recurrent if and only if $H(x, x)=1$ if and only if $G(x, x)=\infty $.

The hitting probabilities lead to an important relation on the state space $S$. For $(x, y)\in S^{2}$, we say that $x$ *leads to* $y$, and we write $x\to y$, if either $x=y$ or $H(x, y)> 0$. It follows immediately from Exercise 2 that $x\to y$ if and only if $P^{n}(x, y)> 0$ for some $n\in \mathbb{N}$. In terms of the graph of the chain, $x\to y$ if and only if there is a directed path from $x$ to $y$. Note that the *leads to* relation is reflexive by definition: $x\to x$ for any $x\in S$.

Show that the *leads to* relation is transitive: if $x\to y$ and $y\to z$ then $x\to z$.

- There exist $j\in \mathbb{N}$ and $k\in \mathbb{N}$ such that $P^{j}(x, y)> 0$ and $P^{k}(y, z)> 0$
- Show that $P^{j+k}(x, z)\ge P^{j}(x, y)P^{k}(y, z)> 0$

A nonempty set of states $A$ is closed if $x\in A$ and $x\to y$ imply $y\in A$. A closed set $A$ is irreducible if $A$ has no proper closed subset.

Suppose that $A\subseteq S$ is closed.

- Show that $P_{A}$, the restriction of $P$ to $A\times A$, is a transition probability matrix on $A$.
- Note that $X$ restricted to $A$ is a Markov chain with transition probability matrix $P_{A}$.
- Show that $\left(P^{n}\right)_{A}=\left(P_{A}\right)^{n}$ for $n\in \mathbb{N}$.

Of course, the entire state space $S$ is closed by definition. If it is also irreducible, we say the Markov chain $X$ is irreducible.

Suppose that $A$ is a nonempty subset of $S$. Show that $\operatorname{cl}(A)=\{y\in S\colon x\to y \text{ for some } x\in A\}$ is the smallest closed set containing $A$; it is called the closure of $A$. That is, show that

- $\operatorname{cl}(A)$ is closed.
- $A\subseteq \operatorname{cl}(A)$
- If $B$ is closed and $A\subseteq B$ then $\operatorname{cl}(A)\subseteq B$
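The closure is just reachability in the state graph, so it can be computed by breadth-first search. A minimal sketch, with a four-state example chain of our own (transition probabilities stored as adjacency dictionaries):

```python
from collections import deque

# Small illustrative chain (ours): z is a neighbor of x when P(x, z) > 0.
P = {0: {0: 0.5, 1: 0.5},
     1: {2: 1.0},
     2: {2: 1.0},
     3: {0: 0.5, 3: 0.5}}

def closure(A):
    """Smallest closed set containing A: breadth-first reachability."""
    seen, queue = set(A), deque(A)
    while queue:
        x = queue.popleft()
        for z, p in P[x].items():
            if p > 0 and z not in seen:
                seen.add(z)
                queue.append(z)
    return seen

assert closure({0}) == {0, 1, 2}   # 0 leads to 1, which leads to 2
assert closure({2}) == {2}         # 2 is absorbing, so {2} is closed
print(closure({3}))
```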

Recall that for a fixed positive integer $k$, $P^{k}$ is also a transition probability matrix, and in fact governs the $k$-step Markov chain $(X_{0}, X_{k}, X_{2k}, \ldots)$. It follows that we could consider the *leads to* relation for this chain, and all of the results above would still hold (relative, of course, to the $k$-step chain). Occasionally we will need to consider this relation, which we will denote by $\to_{k}$, particularly in our study of periodicity.

Suppose that $j$ and $k$ are positive integers. Show that if $x\to_{k} y$ and $j \mid k$, then $x\to_{j} y$.

By combining the *leads to* relation $\to$ with its inverse, the *comes from* relation $\leftarrow$, we can obtain another very useful relation. For $(x, y)\in S^{2}$, we say that $x$ *to and from* $y$, and we write $x\leftrightarrow y$, if $x\to y$ and $y\to x$. By definition, this relation is symmetric: if $x\leftrightarrow y$ then $y\leftrightarrow x$. From our work above, it is also reflexive and transitive. Thus, the *to and from* relation is an equivalence relation. Like all equivalence relations, it partitions the space into mutually disjoint equivalence classes. We will denote the equivalence class of a state $x$ by $[x]=\{y\in S\colon x\leftrightarrow y\}$.

Thus, for any two states $x$ and $y$, either $[x]=[y]$ or $[x]\cap [y]=\emptyset $, and moreover, $\bigcup_{x\in S} [x]=S$

Draw state graphs to illustrate the following facts:

- A closed set is not necessarily an equivalence class.
- An equivalence class is not necessarily closed.

On the other hand, show that if $A$ is a closed, irreducible set of states, then $A$ is an equivalence class.

- Fix $x\in A$ and conclude from closure that $[x]\subseteq A$.
- For arbitrary $y\in A$, use irreducibility to argue that $\operatorname{cl}(\{x\})=A$ and $\operatorname{cl}(\{y\})=A$, and therefore $x\leftrightarrow y$.
- Conclude from (b) that $A\subseteq [x]$.

The *to and from* equivalence relation is very important because many interesting state properties turn out in fact to be class properties, shared by all states in a given equivalence class. In particular, the recurrence and transience properties are class properties.

The following exercise gives the fundamental result of this section: a recurrent state can only lead to other recurrent states.

Suppose that $x$ is a recurrent state and that $x\to y$. Show that $y$ is recurrent and that $H(x, y)=H(y, x)=1$.

- Show first that the result holds if $x=y$, so henceforth assume that $x\neq y$.
- Let $\alpha (x, y)$ denote the probability, starting at $x$, that the chain reaches $y$ without an intermediate return to $x$. Argue that $\alpha (x, y)> 0$ since $x\to y$. In terms of the graph of $X$, if there is a path from $x$ to $y$, then there is a path from $x$ to $y$ without cycles.
- Starting at $x$, the chain could fail to return to $x$ by first reaching $y$ without an intermediate return to $x$, and then from $y$ never reaching $x$. Use the Markov and time homogeneous properties to argue that $1-H(x, x)\ge \alpha (x, y)(1-H(y, x))\ge 0$.
- Conclude from (c) that $H(y, x)=1$.
- Show that there exist positive integers $j$ and $k$ such that $P^{j}(x, y)> 0$ and $P^{k}(y, x)> 0$.
- Show that $P^{j+k+n}(y, y)\ge P^{k}(y, x)P^{n}(x, x)P^{j}(x, y)$ for any $n\in \mathbb{N}$.
- Sum over $n$ in (f) to conclude that $G(y, y)=\infty $ and hence that $y$ is recurrent.
- Finally, reverse the roles of $x$ and $y$ to conclude that $H(x, y)=1$.

From the last exercise, note that if $x$ is recurrent, then all states in the equivalence class of $x$ are also recurrent. Thus, for each equivalence class, either all states are transient or all states are recurrent. We can therefore refer to transient or recurrent classes as well as states.

Suppose that $A$ is a recurrent equivalence class. Show that $A$ is closed and irreducible.

Suppose that $A$ is finite and closed. Show that $A$ has a recurrent state.

- Fix $x\in A$ and argue that $\mathbb{P}(N_{A}=\infty \mid X_{0}=x)=1$ since $A$ is closed.
- Argue that $\mathbb{P}(N_{y}=\infty \mid X_{0}=x)> 0$ for some $y\in A$ since $A$ is finite.
- Conclude that $y$ is recurrent.

Suppose that $A$ is finite, closed, and irreducible. Show that $A$ is a recurrent equivalence class.

- Note that $A$ is an equivalence class by Exercise 15.
- Note that $A$ has a recurrent state by Exercise 18.
- Conclude that all states in $A$ are recurrent.

Thus, the Markov chain $X$ will have a collection (possibly empty) of recurrent equivalence classes $\{A_{j}\colon j\in J\}$ where $J$ is a countable index set. Each $A_{j}$ is closed and irreducible. Let $B$ denote the set of all transient states. The set $B$ may be empty or may consist of a number of equivalence classes, but the class structure of $B$ is not important to us. If the chain starts in $A_{j}$ for some $j\in J$ then the chain remains in $A_{j}$ forever, visiting each state infinitely often with probability 1. If the chain starts in $B$, then the chain may stay in $B$ forever (but only if $B$ is infinite) or may enter one of the recurrent classes $A_{j}$, never to escape. However, in either case, the chain will visit a given transient state only finitely many times with probability 1. This basic structure is known as the canonical decomposition of the chain, and is shown in graphical form below. The edges from $B$ are in gray to indicate that these transitions may not exist.
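For a finite chain, the canonical decomposition can be computed directly: find the equivalence classes via mutual reachability, then use the results above (a finite class is recurrent if and only if it is closed). A sketch with an example matrix of our own:

```python
import numpy as np

# Example chain (ours): {0,1} and {3} should come out recurrent,
# {2} and {4} transient.
P = np.array([[0.5,  0.5,  0,   0,    0],
              [0.5,  0.5,  0,   0,    0],
              [0.25, 0,    0.5, 0.25, 0],
              [0,    0,    0,   1,    0],
              [0,    0,    0.5, 0,    0.5]])
n = len(P)

# reach[x, y]: x leads to y, i.e. P^k(x, y) > 0 for some k >= 0.
reach = np.linalg.matrix_power(P + np.eye(n), n) > 0

# Equivalence classes of the to-and-from relation: mutual reachability.
classes = []
for x in range(n):
    cls = frozenset(y for y in range(n) if reach[x, y] and reach[y, x])
    if cls not in classes:
        classes.append(cls)

def closed(c):
    """A set is closed when no transition leaves it."""
    return all(P[x, y] == 0 for x in c for y in range(n) if y not in c)

recurrent = [c for c in classes if closed(c)]
transient = [c for c in classes if not closed(c)]
print("recurrent:", recurrent)
print("transient:", transient)
```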

Suppose that $A$ is a proper subset of $S$.

- Show that $\left(P_{A}^{n}1_{A}\right)(x)=P_{A}^{n}(x, A)=\mathbb{P}(X_{1}\in A, X_{2}\in A, \ldots, X_{n}\in A\mid X_{0}=x)$ for $x\in A$. This is the probability that the chain stays in $A$ at least through time $n$, starting in state $x$.
- Show that $g_{A}(x)=\lim_{n\to \infty}\left(P_{A}^{n}1_{A}\right)(x)=\mathbb{P}(X_{1}\in A, X_{2}\in A, \ldots\mid X_{0}=x)$ for $x\in A$, using (a) and the continuity theorem for decreasing events. This is the probability that the chain stays in $A$ forever.

The staying probability function $g_{A}$ is an interesting complement to the hitting matrix studied above. The following exercise characterizes this function and provides a method that can be used to compute it, at least in some cases.

Show that $g_{A}$ is the largest function on $A$ that takes values in $\left[0 , 1\right]$ and satisfies $g=P_{A}g$. Moreover, either $g_{A}=0_{A}$ or $\sup \{g_{A}(x)\colon x\in A\}=1$

- Note that $P_{A}^{(n+1)}1_{A}=P_{A}P_{A}^{n}1_{A}$ for $n\in \mathbb{N}$
- Take the limit in (a) as $n\to \infty $ and use the bounded convergence theorem to conclude that $g_{A}=P_{A}g_{A}$
- Suppose now that $g$ is a function on $A$ that takes values in $\left[0 , 1\right]$ and satisfies $g=P_{A}g$. Argue that $g\le 1_{A}$ and hence that $g\le P_{A}^{n}1_{A}$ for all $n\in \mathbb{N}$.
- Take the limit in (c) as $n\to \infty $ to conclude that $g\le g_{A}$.
- Let $c=\sup \{g_{A}(x)\colon x\in A\}$. Argue that $g_{A}\le c1_{A}$ and hence $g_{A}\le cP_{A}^{n}1_{A}$ for each $n\in \mathbb{N}$.
- Take the limit as $n\to \infty $ to conclude that $g_{A}\le cg_{A}$.
- Conclude that either $c=0$ or $c=1$

Note that the characterization in the last exercise includes a zero-one law of sorts: either the probability that the chain stays in $A$ forever is 0 for every initial state $x$, or we can find states for which the probability is arbitrarily close to 1. The next two exercises explore the relationship between the staying function and recurrence.

Suppose that $X$ is an irreducible, recurrent chain with state space $S$. Show that $g_{A}=0_{A}$ for any proper subset $A$ of $S$.

- Fix $y\notin A$ and argue that $0\le g_{A}(x)\le 1-H(x, y)$ for every $x\in A$
- Argue that $g_{A}(x)=0$ since the chain is irreducible and recurrent.

Suppose that $X$ is an irreducible Markov chain with state space $S$ and transition probability matrix $P$. Show that if there exists a state $x$ such that $g_{A}=0_{A}$ where $A=S\setminus \{x\}$ then $X$ is recurrent.

- With $A$ as defined above, show that $1-H(x, x)=\sum_{y\in A} P(x, y)g_{A}(y)$.
- Conclude that $x$ is recurrent and hence that the chain is recurrent.

More generally, suppose that $X$ is a Markov chain with state space $S$ and transition probability matrix $P$. The last two exercises can be used to test whether a closed, irreducible equivalence class $C$ is recurrent or transient. We fix a state $x\in C$ and set $A=C\setminus \{x\}$. We then try to solve the equation $g=P_{A}g$ on $A$. If the only solution taking values in $\left[0 , 1\right]$ is $0_{A}$, then the class $C$ is recurrent by Exercise 23. If there are nontrivial solutions, then $C$ is transient by Exercise 22. Often we try to choose $x$ to make the computations easy.
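As an illustration of the method (our example, not from the text), consider the birth-death chain on $\mathbb{N}$ with constant up-probability $p$ and down-probability $q=1-p$ (and $P(0,1)=1$, so the chain is irreducible). Taking $x=0$ and $A=\{1,2,\ldots\}$, the candidate $g(y)=1-(q/p)^{y}$ is a nontrivial solution of $g=P_{A}g$ when $p> 1/2$, so the chain is transient; when $p\le 1/2$ the candidate degenerates to 0, consistent with recurrence. The sketch below verifies the candidate numerically.

```python
# Birth-death chain on {0,1,2,...} (our example): up-prob p, down-prob q.
# Fix x = 0 and A = {1,2,...}; look for solutions of g = P_A g in [0,1].
p = 2 / 3
q = 1 - p
g = lambda y: 1 - (q / p) ** y   # candidate staying function for p > 1/2

# g = P_A g means g(y) = p g(y+1) + q g(y-1) for y >= 2, while a step
# from 1 down to 0 leaves A, so the boundary equation is g(1) = p g(2).
for y in range(2, 50):
    assert abs(g(y) - (p * g(y + 1) + q * g(y - 1))) < 1e-12
assert abs(g(1) - p * g(2)) < 1e-12
assert 0 <= g(1) <= 1 and g(10**6) > 1 - 1e-9   # values in [0,1], sup = 1
print("nontrivial staying function found: chain is transient for p =", p)
```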

We now know quite a bit about Markov chains, and we can often classify the states and compute quantities of interest. However, we do not yet know how to compute:

- $G(x, y)$ when $x$ and $y$ are transient
- $H(x, y)$ when $x$ is transient and $y$ is transient or recurrent.

These problems are related, because of the general inverse relationship between the hitting matrix and the potential matrix noted in our discussion above. As usual, suppose that $X$ is a Markov chain with state space $S$, and let $B$ denote the set of transient states. The next exercise shows how to compute $G_{B}$, the potential matrix restricted to the transient states. Recall that the values of this matrix are finite.

Show that $G_{B}$ satisfies the equation $G_{B}=P_{B}+P_{B}G_{B}$ and is the smallest nonnegative solution. If $B$ is finite, $G_{B}=\left(I_{B}-P_{B}\right)^{-1}P_{B}$.

- Argue that $P^{n}_{B}=P_{B}^{n}$ since a path between two transient states can only pass through other transient states.
- Conclude that $G_{B}=\sum_{n=1}^{\infty} P_{B}^{n}$
- Use the monotone convergence theorem to show that $P_{B}G_{B}=G_{B}-P_{B}$.
- Suppose that $U$ is a nonnegative matrix on $B$ satisfying $U=P_{B}+P_{B}U$. Show that $U=\sum_{k=1}^{n} P_{B}^{k}+P_{B}^{n+1}U$ for each positive integer $n$.
- Conclude that $U\ge \sum_{k=1}^{n} P_{B}^{k}$ for every positive integer $n$ and hence $U\ge G_{B}$.
- Show that $(I_{B}-P_{B})(I_{B}+G_{B})=I_{B}$. Conclude that if $B$ is finite, the matrix $I_{B}-P_{B}$ is invertible.
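When $B$ is finite, the matrix formula is a one-line linear solve. A sketch using the fair gambler's-ruin chain on $\{0,1,2,3\}$ as an assumed example (0 and 3 absorb, so $B=\{1,2\}$):

```python
import numpy as np

# Fair gambler's ruin on {0,1,2,3} (our example): 0 and 3 are absorbing,
# so the transient states are B = {1, 2}.
P = np.array([[1, 0, 0, 0],
              [0.5, 0, 0.5, 0],
              [0, 0.5, 0, 0.5],
              [0, 0, 0, 1.0]])
B = [1, 2]
PB = P[np.ix_(B, B)]
I = np.eye(len(B))

GB = np.linalg.solve(I - PB, PB)           # (I_B - P_B)^{-1} P_B
# Cross-check against the truncated series sum_{n>=1} P_B^n.
GB_series = sum(np.linalg.matrix_power(PB, n) for n in range(1, 100))
assert np.allclose(GB, GB_series)
print(GB)
```

For this chain the expected numbers of visits work out to $G_{B}(1,1)=G_{B}(2,2)=1/3$ and $G_{B}(1,2)=G_{B}(2,1)=2/3$, which the series cross-check confirms.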

Now that we can compute $G_{B}$, we can also compute $H_{B}$ using the result in Exercise 8. All that remains is for us to compute the hitting probability $H(x, y)$ when $x$ is transient and $y$ is recurrent. The first thing to notice is that the hitting probability is a class property.

Suppose that $x$ is transient and that $A$ is a recurrent class. Show that $H(x, y)=\mathbb{P}(T_{A}< \infty \mid X_{0}=x)$ for $y\in A$. That is, the hitting probability to $y$ is constant for $y\in A$, and is just the hitting probability to the class $A$.

As before, let $B$ denote the set of transient states and suppose that $A$ is a recurrent equivalence class. Let $h_{A}$ denote the function on $B$ that gives the hitting probability to class $A$, and let $p_{A}$ denote the function on $B$ that gives the probability of entering $A$ on the first step:

$$h_{A}(x)=\mathbb{P}(T_{A}< \infty \mid X_{0}=x)\text{,\hspace{0.5em}}p_{A}(x)=P(x, A)\text{;\hspace{1em}}x\in B$$

Show that $h_{A}=p_{A}+G_{B}p_{A}$.

- Show that $\mathbb{P}(T_{A}=n\mid X_{0}=x)=\left(P_{B}^{n-1}p_{A}\right)(x)$ for $n\in \mathbb{N}_{+}$.
- Sum over $n$.

The result in the last exercise is adequate if we have already computed $G_{B}$ (using the result in Exercise 24, for example). However, we might just want to compute $h_{A}$ directly.

Show that $h_{A}$ satisfies the equation $h_{A}=p_{A}+P_{B}h_{A}$ and is the smallest nonnegative solution. If $B$ is finite, $h_{A}=\left(I_{B}-P_{B}\right)^{-1}p_{A}$.

- Show that $h_{A}=p_{A}+P_{B}h_{A}$ by conditioning on $X_{1}$.
- Suppose that $h$ is nonnegative and satisfies $h=p_{A}+P_{B}h$. Show that $h=p_{A}+\sum_{k=1}^{n-1} P_{B}^{k}p_{A}+P_{B}^{n}h$ for each positive integer $n$.
- Conclude that $h\ge p_{A}+\sum_{k=1}^{n-1} P_{B}^{k}p_{A}$
- Take the limit as $n\to \infty $ to conclude that $h\ge h_{A}$.
- Note that the representation when $B$ is finite follows from Exercise 24.
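Again, for finite $B$ this is a single linear solve. A sketch under an assumed example of our own: biased gambler's ruin on $\{0,1,2,3\}$ with up-probability 0.6, target class $A=\{3\}$, so the classical ruin formula gives $h_{A}(1)=9/19$ and $h_{A}(2)=15/19$.

```python
import numpy as np

# Biased gambler's ruin on {0,1,2,3} (our example): up-probability 0.6,
# states 0 and 3 absorb.  B = {1, 2} is transient, A = {3} is recurrent.
P = np.array([[1, 0, 0, 0],
              [0.4, 0, 0.6, 0],
              [0, 0.4, 0, 0.6],
              [0, 0, 0, 1.0]])
B = [1, 2]                                # transient states
A = [3]                                   # target recurrent class
PB = P[np.ix_(B, B)]
pA = P[np.ix_(B, A)].sum(axis=1)          # p_A(x) = P(x, A) for x in B

# h_A = (I_B - P_B)^{-1} p_A
hA = np.linalg.solve(np.eye(len(B)) - PB, pA)
assert np.allclose(hA, [9/19, 15/19])     # classical ruin probabilities
print(hA)
```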

Consider a Markov chain with state space $S=\{a, b, c, d\}$ and transition matrix $P$ given below:

$$P=\begin{pmatrix}1/3 & 2/3 & 0 & 0\\ 1 & 0 & 0 & 0\\ 0 & 0 & 1 & 0\\ 1/4 & 1/4 & 1/4 & 1/4\end{pmatrix}$$

- Draw the state diagram.
- Find the equivalence classes and classify each as transient or recurrent.
- Compute the potential matrix $G$.
- Compute the hitting matrix $H$.

Consider a Markov chain with state space $S=\{1, 2, 3, 4, 5, 6\}$ and transition matrix $P$ given below:

$$P=\begin{pmatrix}0 & 0 & 1/2 & 0 & 1/2 & 0\\ 0 & 0 & 0 & 0 & 0 & 1\\ 1/4 & 0 & 1/2 & 0 & 1/4 & 0\\ 0 & 0 & 0 & 1 & 0 & 0\\ 0 & 0 & 1/3 & 0 & 2/3 & 0\\ 0 & 1/4 & 1/4 & 1/4 & 0 & 1/4\end{pmatrix}$$

- Sketch the state graph.
- Find the equivalence classes and classify each as recurrent or transient.
- Compute the potential matrix $G$.
- Compute the hitting matrix $H$.

Consider a Markov chain with state space $S=\{1, 2, 3, 4, 5, 6\}$ and transition matrix $P$ given below:

$$P=\begin{pmatrix}1/2 & 1/2 & 0 & 0 & 0 & 0\\ 1/4 & 3/4 & 0 & 0 & 0 & 0\\ 1/4 & 0 & 1/2 & 1/4 & 0 & 0\\ 1/4 & 0 & 1/4 & 1/4 & 0 & 1/4\\ 0 & 0 & 0 & 0 & 1/2 & 1/2\\ 0 & 0 & 0 & 0 & 1/2 & 1/2\end{pmatrix}$$

- Sketch the state graph.
- Find the equivalence classes and classify each as recurrent or transient.
- Compute the potential matrix $G$.
- Compute the hitting matrix $H$.

Read again the definitions of the Ehrenfest chains and the Bernoulli-Laplace chains. Note that since these chains are irreducible and have finite state spaces, they are recurrent.

Read the discussion on recurrence in the section on the reliability chains and work the exercises.

Read the discussion on random walks on $\mathbb{Z}^{k}$ in the section on the random walks on graphs and work the exercises.

Read the discussion on extinction and explosion in the section on the branching chain and work the exercises.

Read the discussion on recurrence and transience in the section on queuing chains and work the exercises.

Read the discussion on recurrence and transience in the section on birth-death chains and work the exercises.