
## 9. Branching Chain

#### Introduction

Generically, suppose that we have a system of particles that can generate or split into other particles of the same type. Here are some typical examples:

• The particles are biological organisms that reproduce.
• The particles are neutrons in a chain reaction.
• The particles are electrons in an electron multiplier.

We assume that each particle, at the end of its life, is replaced by a random number of new particles that we will refer to as children of the original particle. Our basic assumption is that the particles act independently, each with the same offspring distribution on $\mathbb{N}$. Let $f$ denote the common probability density function of the number of offspring of a particle. We will also let $f^{*n} = f * f * \cdots * f$ denote the convolution power of degree $n$ of $f$; this is the probability density function of the total number of children of $n$ particles.

We will not yet try to study the evolution of the system in real time, but rather in generational time. Specifically, the particles that we start with are in generation 0, and recursively, the children of a particle in generation $n$ are in generation $n + 1$.

Let $X_n$ denote the number of particles in generation $n$. One way to define the process mathematically is to start with an array of independent random variables $(U_{n,i} : n \in \mathbb{N}, i \in \mathbb{N}_+)$, each with probability density function $f$. We interpret $U_{n,i}$ as the number of children of the $i$th particle in generation $n$ (if this particle exists). Note that we have more random variables than we need, but this causes no harm, and we know that we can construct a probability space that supports such an array of random variables. We can now define our state variables recursively by

$X_{n+1} = \sum_{i=1}^{X_n} U_{n,i}, \quad n \in \mathbb{N}$

Show that $X = (X_0, X_1, \ldots)$ is a Markov chain on $\mathbb{N}$ with transition matrix $P$ given by

$P(x, y) = f^{*x}(y), \quad (x, y) \in \mathbb{N}^2$

Note that the descendants of each initial particle form a branching chain, and these chains are independent. Thus, the branching chain starting with $x$ particles is equivalent to $x$ independent copies of the branching chain starting with 1 particle. This feature turns out to be very important in the analysis of the chain. Note also that 0 is an absorbing state that corresponds to extinction. Computing the probability of extinction is one of the fundamental problems in branching chains; we will essentially solve this problem in the next section.
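
The recursive construction above translates directly into a simulation. The following sketch (the names `branching_chain` and `split` are ours, not from the text) draws each $U_{n,i}$ on demand rather than building the full array:

```python
import random

def branching_chain(x0, offspring, steps, seed=None):
    """Simulate X_0, X_1, ..., X_steps for the branching chain.

    offspring(rng) draws one value from the offspring pdf f.
    Returns the path as a list [X_0, ..., X_steps]."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(steps):
        # Each of the X_n current particles is replaced by an independent
        # number of children drawn from f; X_{n+1} is the total.
        path.append(sum(offspring(rng) for _ in range(path[-1])))
    return path

# Example offspring law: each particle dies (prob 1 - p) or splits in two (prob p).
p = 0.6
split = lambda rng: 2 if rng.random() < p else 0
print(branching_chain(1, split, 10, seed=42))
```

Once the path hits 0 it stays there, reflecting the fact that 0 is absorbing.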

Explicitly find the one-step transition matrix $P$ for each of the following offspring distributions:

1. The Poisson distribution with parameter $m > 0$.
2. The binomial distribution with trial parameter $k$ and success parameter $p \in (0, 1)$.
3. The geometric distribution on $\mathbb{N}$ with success parameter $1 - p \in (0, 1)$.
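
Since $P(x, \cdot) = f^{*x}$, a row of the transition matrix can be computed numerically by repeated convolution. A minimal sketch (the helper `transition_row` is our own name), checked against the Poisson case, where a sum of $x$ independent Poisson($m$) variables is Poisson($mx$):

```python
import math
import numpy as np

def transition_row(f, x, size):
    """Row P(x, .) of the branching-chain transition matrix:
    the x-fold convolution power of the offspring pdf f,
    truncated to support {0, ..., size - 1}."""
    row = np.zeros(size)
    row[0] = 1.0                      # point mass at 0 (empty sum)
    for _ in range(x):
        row = np.convolve(row, f)[:size]
    return row

# Poisson check: P(x, y) should equal e^{-mx} (mx)^y / y!.
m, x, size = 1.5, 3, 60
f = np.array([math.exp(-m) * m**k / math.factorial(k) for k in range(size)])
row = transition_row(f, x, size)
assert abs(row[4] - math.exp(-m * x) * (m * x)**4 / math.factorial(4)) < 1e-9
```

The truncation at `size` is harmless as long as the tail mass beyond it is negligible.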

Consider the branching chain with offspring probability density function given by $f(0) = 1 - p$, $f(2) = p$, where $p \in (0, 1)$ is a parameter. Thus, each particle either dies or splits into two new particles. Find the probability density function of $X_1$, $X_2$, and $X_3$, starting with 1 particle.
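
The exact generation distributions can also be computed mechanically from $P(x, y) = f^{*x}(y)$: the pdf of $X_{n+1}$ is $\sum_x \Pr(X_n = x) f^{*x}$. A sketch (the function names are ours), printed for the die-or-split offspring law with $p = 1/2$:

```python
import numpy as np

def convolve_power(f, x):
    """pdf of the sum of x i.i.d. variables with pdf f (array over 0, 1, 2, ...)."""
    g = np.array([1.0])               # point mass at 0 (empty sum)
    for _ in range(x):
        g = np.convolve(g, f)
    return g

def generation_pdf(f, n):
    """Exact pdf of X_n for the branching chain started at X_0 = 1."""
    f = np.array(f, dtype=float)
    dist = f.copy()                   # pdf of X_1 is f itself
    for _ in range(n - 1):
        size = (len(dist) - 1) * (len(f) - 1) + 1
        new = np.zeros(size)
        for x, px in enumerate(dist):
            if px > 0:
                g = convolve_power(f, x)
                new[:len(g)] += px * g
        dist = new
    return dist

p = 0.5
f = [1 - p, 0.0, p]                   # f(0) = 1 - p, f(2) = p
for n in (1, 2, 3):
    print(n, generation_pdf(f, n))
```

For instance, with $p = 1/2$ the pdf of $X_2$ is $0.625, 0.25, 0.125$ at $0, 2, 4$.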

#### Extinction and Explosion

Let $m$ denote the mean of the offspring distribution. Show that

1. $E(X_{n+1}) = m E(X_n)$ for $n \in \mathbb{N}$.
2. $E(X_n) = m^n E(X_0)$ for $n \in \mathbb{N}$.
3. $E(X_n) \to 0$ as $n \to \infty$ (extinction in the mean) if $m < 1$.
4. $E(X_n) = E(X_0)$ for each $n \in \mathbb{N}$ (stability in the mean) if $m = 1$.
5. $E(X_n) \to \infty$ as $n \to \infty$ (explosion in the mean) if $m > 1$ and $E(X_0) > 0$.
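
The geometric growth of the mean is easy to check by Monte Carlo. A sketch (names and the particular offspring pdf are our own choices):

```python
import random

def mean_generation(x0, support, probs, n, trials, seed=0):
    """Monte Carlo estimate of E(X_n) for the branching chain
    with offspring pdf given by (support, probs), started at x0."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x = x0
        for _ in range(n):
            # Replace each of the x particles by an independent
            # draw from the offspring distribution.
            x = sum(rng.choices(support, probs)[0] for _ in range(x))
        total += x
    return total / trials

# Offspring pdf f(0) = 0.25, f(1) = 0.25, f(2) = 0.5 has mean m = 1.25,
# so E(X_3 | X_0 = 1) = m**3 = 1.953125; the estimate should be close.
est = mean_generation(1, [0, 1, 2], [0.25, 0.25, 0.5], 3, 20000)
print(est)
```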

Recall that state 0 is absorbing (there are no particles), and hence $\bigcup_{n=1}^\infty \{X_n = 0\} = \{T_0 < \infty\}$ is the extinction event (where as usual, $T_0$ is the time of the first return to 0). We are primarily concerned with the probability of extinction, as a function of the initial state. First, however, we will make some simple observations and eliminate some trivial cases.

Suppose that $f(1) = 1$, so that each particle is replaced by a single new particle. Show that

1. Every state is absorbing.
2. The equivalence classes are the singleton sets.
3. $X_n = X_0$ for every $n \in \mathbb{N}$.

Suppose now that $f(0) > 0$, so that it is possible for a particle to simply die without offspring. Show that

1. Every state leads to 0.
2. Every positive state is transient.
3. With probability 1, either $X_n = 0$ for some $n$ (extinction) or $X_n \to \infty$ as $n \to \infty$ (explosion).

Suppose now that $f(0) = 0$ and $f(1) < 1$, so that every particle is replaced by at least one particle, and possibly more than one. Show that explosion is certain, starting with at least one particle. That is, $\Pr(X_n \to \infty \text{ as } n \to \infty \mid X_0 = x) = 1$ for every $x \geq 1$.

Suppose now that $f(0) > 0$ and $f(0) + f(1) = 1$, so that it is possible for a particle to die without offspring, and a particle is not replaced by more than one particle. Show that with probability 1, $X_n = 0$ for some $n$, so extinction is certain.

Thus, the interesting case is when $f(0) > 0$ and $f(0) + f(1) < 1$, so that it is possible for a particle to die without offspring, and also possible for a particle to be replaced by more than one new particle. We will assume these conditions for the remainder of our discussion. By Exercise 6, all states lead to 0 (extinction). We will denote the probability of extinction, starting with one particle, by

$q = \Pr(T_0 < \infty \mid X_0 = 1) = \Pr(X_n = 0 \text{ for some } n \in \mathbb{N} \mid X_0 = 1)$

Show that $\mathbb{N}_+$ is a transient equivalence class, and the probability of extinction starting with $x$ particles is

$q^x = \Pr(T_0 < \infty \mid X_0 = x) = \Pr(X_n = 0 \text{ for some } n \in \mathbb{N} \mid X_0 = x)$

Condition on the first state to establish the following equation:

$q = \sum_{x=0}^\infty f(x) q^x$

Thus the extinction probability $q$ starting with 1 particle is a fixed point of the probability generating function $\Phi(t) = \sum_{x=0}^\infty f(x) t^x$ of the offspring distribution. Moreover, from the general discussion of hitting probabilities in the section on recurrence and transience, $q$ is the smallest such number in the interval $[0, 1]$.
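
Because $\Phi$ is increasing on $[0, 1]$ with $\Phi(1) = 1$, iterating $t \mapsto \Phi(t)$ from $t = 0$ produces an increasing sequence that converges to the smallest fixed point, i.e. to $q$. A numerical sketch (the function name is ours), illustrated with Poisson offspring, for which $\Phi(t) = e^{m(t-1)}$:

```python
import math

def extinction_probability(phi, tol=1e-12, max_iter=100_000):
    """Iterate t <- phi(t) starting from 0.  Since phi is increasing
    on [0, 1] with phi(1) = 1, the iterates increase to the smallest
    fixed point of phi in [0, 1], which is the extinction probability q."""
    t = 0.0
    for _ in range(max_iter):
        t_next = phi(t)
        if t_next - t < tol:
            return t_next
        t = t_next
    return t

# Poisson(m) offspring: Phi(t) = exp(m (t - 1)).
m = 2.0
q = extinction_probability(lambda t: math.exp(m * (t - 1)))
print(q)
```

For $m = 2$ this gives $q \approx 0.2032$, the unique root of $q = e^{2(q-1)}$ in $(0, 1)$.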

Show that

1. $\Phi(0) = f(0)$
2. $\Phi(1) = 1$
3. $\Phi'(t) > 0$ for $t \in (0, 1)$, so $\Phi$ is increasing on $(0, 1)$.
4. $\Phi''(t) > 0$ for $t \in (0, 1)$, so $\Phi$ is concave upward on $(0, 1)$.
5. $\Phi'(1) = m$

Based on the previous exercise and the graph of $\Phi$, conclude that

1. If $m \leq 1$ then $q = 1$, so extinction is certain.
2. If $m > 1$ then $0 < q < 1$, so there is a positive probability of extinction and a positive probability of explosion.

Find the mean $m$ of each of the following offspring probability density functions. Find the extinction probability $q$ starting with 1 particle.

1. $f(n) = (1 - p) p^n$ for $n \in \mathbb{N}$, where $p \in (0, 1)$ is a parameter (the geometric distribution on $\mathbb{N}$).
2. $f(0) = 1 - p$, $f(2) = p$, where $p \in (0, 1)$ is a parameter. Thus, each particle either dies or splits into two new particles.
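
Both cases can be checked numerically by iterating the generating function to its smallest fixed point in $[0, 1]$, as in the previous section. A sketch (the helper name is ours), using $p = 0.7$ so that both chains are supercritical:

```python
def phi_fixed_point(phi, tol=1e-12, max_iter=100_000):
    """Iterate t <- phi(t) from 0; the iterates increase to the
    smallest fixed point of phi in [0, 1], the extinction probability q."""
    t = 0.0
    for _ in range(max_iter):
        t_next = phi(t)
        if t_next - t < tol:
            return t_next
        t = t_next
    return t

p = 0.7
# 1. Geometric offspring on N: Phi(t) = (1 - p) / (1 - p t), mean m = p / (1 - p).
q_geom = phi_fixed_point(lambda t: (1 - p) / (1 - p * t))
# 2. Binary splitting: Phi(t) = (1 - p) + p t^2, mean m = 2 p.
q_split = phi_fixed_point(lambda t: (1 - p) + p * t * t)
print(q_geom, q_split)
```

In both cases the fixed-point equation reduces to $p t^2 - t + (1 - p) = 0$, with roots $1$ and $(1 - p)/p$, so the two models share the same extinction probability even though their means differ.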