A renewal process is an idealized stochastic model for events

that occur randomly in time. These temporal events are generically referred to as renewals or arrivals. Here are some typical interpretations and applications.

- The arrivals are customers arriving at a service station. Again, the terms are generic. A customer might be a person and the service station a store, but a customer might also be a file request and the service station a web server.
- A device is placed in service and eventually fails. It is replaced by a device of the same type and the process is repeated. We do not count the replacement time in our analysis; equivalently, we can assume that the replacement is immediate. The times of the replacements are the renewals.
- The arrivals are times of some natural event, such as a lightning strike or a tornado, at a particular geographical point.
- The arrivals are emissions of elementary particles from a radioactive source.

The basic model actually gives rise to several interrelated random processes: the sequence of interarrival times, the sequence of arrival times, a counting process, and several age processes. In this section we will define and study the basic properties of each of these processes in turn.

Let $X_{i}$ denote the $i$th interarrival time, that is, the time between the $(i-1)$st and $i$th arrivals. Our basic assumption is that $X=(X_{1}, X_{2}, \ldots)$ is a sequence of independent, identically distributed random variables. In statistical terms, $X$ corresponds to sampling from the distribution of a generic interarrival time $X$. We assume that $\mathbb{P}(0\le X< \infty )=1$ and $\mathbb{P}(X> 0)> 0$, so that the interarrival times are nonnegative, but not identically 0. Let $\mu =\mathbb{E}(X)$ denote the common mean of the interarrival times. We allow the possibility that $\mu =\infty $.

On the other hand, show that $\mu > 0$.

If $\mu < \infty $, we will let $\sigma ^{2}=\operatorname{var}(X)$ denote the common variance of the interarrival times. Let $F$ denote the common distribution function of the interarrival times, so that

$$F(x)=\mathbb{P}(X\le x)\text{,\hspace{1em}}x\in \left[0 , \infty \right)$$

The distribution function $F$ turns out to be of fundamental importance in the study of renewal processes. We will let $f$ denote the probability density function of the interarrival times if the distribution is discrete, or if the distribution is continuous and has a density function.

The renewal process is said to be periodic if there exists a positive number $d$ such that $\mathbb{P}(X=nd \text{ for some } n\in \mathbb{N})=1$. The largest such $d$ is the period.

Let

$$T_{n}=\sum_{i=1}^{n} X_{i}\text{,\hspace{1em}}n\in \mathbb{N}$$

We follow our usual convention that the sum over an empty index set is 0; thus $T_{0}=0$. On the other hand, $T_{n}$ is the time of the $n$th arrival for $n\in \mathbb{N}_+$. The sequence $T=(T_{0}, T_{1}, \ldots)$ is called the arrival time process, although note that $T_{0}=0$ is not considered an arrival. A renewal process is so named because the process starts over, independently of the past, at each arrival time.

The sequence $T$ is the partial sum process associated with the independent, identically distributed sequence of interarrival times $X=(X_{1}, X_{2}, \ldots)$. Partial sum processes associated with independent, identically distributed sequences have been studied in several places in this project. In the remainder of this subsection, we will collect some of the more important facts about such processes.

First, we will let $F_{n}$ denote the distribution function of $T_{n}$, so that

$$F_{n}(t)=\mathbb{P}(T_{n}\le t)\text{,\hspace{1em}}t\in \left[0 , \infty \right)$$

Recall that if $X$ has probability density function $f$ (in either the discrete or continuous case), then $T_{n}$ has probability density function $f_{n}=f^{*n}=f*f*\cdots*f$, the $n$-fold convolution power of $f$.

Recall that the sequence of arrival times $T$ has stationary, independent increments:

- If $m\le n$ then $T_{n}-T_{m}$ has the same distribution as $T_{n-m}$, and thus has distribution function $F_{n-m}$.
- If $n_{1}\le n_{2}\le n_{3}\le \cdots$ then $\left(T_{n_{1}}, T_{n_{2}}-T_{n_{1}}, T_{n_{3}}-T_{n_{2}}, \ldots\right)$ is a sequence of independent random variables.

Show or recall that

- $\mathbb{E}(T_{n})=n\mu $
- $\operatorname{var}(T_{n})=n\sigma ^{2}$
- $\operatorname{cov}(T_{m}, T_{n})=\min\{m , n\}\sigma ^{2}$
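As a numerical sanity check, the first two moment formulas above can be verified by simulation. The following minimal Monte Carlo sketch uses exponential interarrival times with rate $r=2$ (an arbitrary illustrative choice, so $\mu = 1/r$ and $\sigma^2 = 1/r^2$):

```python
import random

# Monte Carlo check of E(T_n) = n*mu and var(T_n) = n*sigma^2,
# using exponential interarrival times with rate r (an arbitrary choice).
random.seed(7)
r, n, reps = 2.0, 5, 200_000
mu, sigma2 = 1 / r, 1 / r**2

# Each sample is one realization of the arrival time T_n = X_1 + ... + X_n
samples = [sum(random.expovariate(r) for _ in range(n)) for _ in range(reps)]
mean = sum(samples) / reps
var = sum((t - mean) ** 2 for t in samples) / reps

print(mean, n * mu)      # both should be close to 2.5
print(var, n * sigma2)   # both should be close to 1.25
```

The same experiment with any other interarrival distribution of finite variance should give matching agreement, since the formulas depend only on $\mu$ and $\sigma^2$.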

Recall the law of large numbers: $\frac{T_{n}}{n}\to \mu $ as $n\to \infty $:

- With probability 1 (the strong law).
- In probability (the weak law).

Show that $T_{n}\nearrow \infty $ as $n\to \infty $ with probability 1.

- Note that $T_{n}\le T_{n+1}$.
- Show that $\mathbb{P}(T_{n}=T_{n+1})=F(0)$. This can be positive, so with positive probability, more than one arrival can occur at the same time.
- Show that there exists $t> 0$ such that $\mathbb{P}(X> t)> 0$.
- Use the second Borel-Cantelli lemma to conclude that with probability 1, $X_{i}> t$ for infinitely many $i\in \mathbb{N}_+$.
- Conclude that $\sum_{i=1}^{\infty} X_{i}=\infty $ with probability 1.

For $t\ge 0$, let $N_{t}$ denote the number of arrivals in the interval $\left[0 , t\right]$:

$$N_{t}=\sum_{n=1}^{\infty} \mathbf{1}(T_{n}\le t)\text{,\hspace{1em}}t\ge 0$$

We will refer to the random process $N=\{N_{t}\colon t\ge 0\}$ as the counting process.

Show that $N_{t}=\max\{n\in \mathbb{N}\colon T_{n}\le t\}$ for $t\ge 0$.

Show that if $s\le t$ then $N_{t}-N_{s}$ is the number of arrivals in $\left(s , t\right]$.

Note that as a function of $t$, $N_{t}$ is a (random) step function, with jumps at the distinct values of $T_{1}, T_{2}, \ldots$; the size of the jump at an arrival time is the number of arrivals at that time. In particular, $N_{t}$ is an increasing function of $t$.
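Computationally, the passage from interarrival times to arrival times to counting variables is just a cumulative sum followed by a binary search. A minimal sketch (exponential interarrivals are an arbitrary choice; any nonnegative distribution works):

```python
import bisect
import itertools
import random

# Build arrival times T_n from interarrival times X_i, then evaluate the
# counting variable N_t = #{n >= 1 : T_n <= t} by binary search.
random.seed(3)
x = [random.expovariate(1.0) for _ in range(100)]  # X_1, ..., X_100
arrivals = list(itertools.accumulate(x))           # T_1, T_2, ..., T_100

def count(t, arrivals=arrivals):
    """N_t: the number of arrival times T_n in [0, t]."""
    return bisect.bisect_right(arrivals, t)

# N_t is an increasing step function of t
assert all(count(t) <= count(t + 1) for t in range(50))
# T_n <= t if and only if N_t >= n (the basic inverse relation)
assert all((arrivals[n - 1] <= t) == (count(t) >= n)
           for n in range(1, 101) for t in range(50))
```

The `bisect_right` call works because the arrival times are nondecreasing, which is exactly the observation that $N_t$ is increasing in $t$.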

More generally, we can define the (random) counting measure corresponding to the sequence of random points $T=(T_{1}, T_{2}, \ldots)$ in $\left[0 , \infty \right)$. Thus, if $A$ is a (measurable) subset of $\left[0 , \infty \right)$, we will let $N(A)$ denote the number of the random points in $A$:

$$N(A)=\sum_{n=1}^{\infty} \mathbf{1}(T_{n}\in A)$$

In particular, note that with our new notation, $N(\left[0 , t\right])=N_{t}$ and $N(\left(s , t\right])=N_{t}-N_{s}$. Thus, the random counting measure is completely determined by the counting process. The counting process is the cumulative measure function for the counting measure, analogous to the cumulative distribution function of a probability measure.

Show that for $t\ge 0$ and $n\in \mathbb{N}$,

- $T_{n}\le t$ if and only if $N_{t}\ge n$. This event means that there are at least $n$ arrivals in $\left[0 , t\right]$.
- $N_{t}=n$ if and only if $T_{n}\le t< T_{n+1}$. This event means that there are exactly $n$ arrivals in $\left[0 , t\right]$.

Prove the following results:

- $N_{t}< \infty $ for all $t\in \left[0 , \infty \right)$ if and only if $T_{n}\to \infty $ as $n\to \infty $. Thus conclude that this event has probability 1.
- $N_{t}\to \infty $ as $t\to \infty $ if and only if $T_{n}< \infty $ for all $n\in \mathbb{N}$. Thus conclude that this event has probability 1.

All of the exercises so far in this subsection show that the arrival time process $T$ and the counting process $N$ are inverses of one another in a sense. The important equivalences in Exercise 8 can be used to obtain the probability distribution of the counting variables in terms of the interarrival distribution function $F$.

Show that for $t\ge 0$ and $n\in \mathbb{N}$,

- $\mathbb{P}(N_{t}\ge n)=F_{n}(t)$
- $\mathbb{P}(N_{t}=n)=F_{n}(t)-F_{n+1}(t)$
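For a discrete interarrival distribution, the distribution functions $F_n$ can be computed directly by convolution, and the identity $\mathbb{P}(N_t = n) = F_n(t) - F_{n+1}(t)$ can then be checked against simulation. The interarrival distribution below ($\mathbb{P}(X=1)=0.6$, $\mathbb{P}(X=2)=0.4$) is an arbitrary illustrative choice:

```python
import random
from collections import defaultdict

# Interarrival density: P(X=1) = 0.6, P(X=2) = 0.4 (an arbitrary example)
f = {1: 0.6, 2: 0.4}

def convolve(g, h):
    """Density of the sum of independent variables with densities g and h."""
    out = defaultdict(float)
    for a, p in g.items():
        for b, q in h.items():
            out[a + b] += p * q
    return dict(out)

def F_n(n, t):
    """Distribution function of T_n = X_1 + ... + X_n, evaluated at t."""
    g = {0: 1.0}
    for _ in range(n):
        g = convolve(g, f)          # n-fold convolution power of f
    return sum(p for s, p in g.items() if s <= t)

t, n = 5, 3
exact = F_n(n, t) - F_n(n + 1, t)   # P(N_t = n) from the identity

# Monte Carlo estimate of P(N_t = n) for comparison
random.seed(1)
reps = 100_000
hits = 0
for _ in range(reps):
    s = cnt = 0
    while True:
        s += 1 if random.random() < 0.6 else 2
        if s > t:
            break
        cnt += 1
    hits += (cnt == n)
print(exact, hits / reps)  # the two values should agree closely
```

Here `exact` works out to $0.4608$ by direct convolution, and the simulation frequency should land within Monte Carlo error of it.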

The expected number of arrivals up to time $t$ is known as the renewal function:

$$m(t)=\mathbb{E}(N_{t})\text{,\hspace{1em}}t\ge 0$$

The renewal function turns out to be of fundamental importance in the study of renewal processes. Indeed, the renewal function essentially characterizes the renewal process. It will take a while to fully understand this, but as a first step, the expansion in terms of indicator variables leads to a nice connection between the renewal function and the interarrival distribution function.

Show that

$$m(t)=\sum_{n=1}^{\infty} F_{n}(t)\text{,\hspace{1em}}t\ge 0$$

Note that we have not yet shown that $m(t)< \infty $, and note also that this does not follow from the previous exercise. However, we will establish this finiteness condition in the subsection on Moment Generating Functions below.

More generally, if $A$ is a (measurable) subset of $\left[0 , \infty \right)$, let $m(A)=\mathbb{E}(N(A))$, the expected number of arrivals in $A$.

Show that $m$ is a positive measure on the measurable subsets of $\left[0 , \infty \right)$; this measure is known as the renewal measure.

Show that

$$m(A)=\sum_{n=1}^{\infty} \mathbb{P}(T_{n}\in A)$$

Show that if $s\le t$ then $m(t)-m(s)=m(\left(s , t\right])$, the expected number of arrivals in $\left(s , t\right]$.

The last exercise implies that the renewal function actually determines the entire renewal measure. The renewal function is the cumulative measure function, analogous to the cumulative distribution function of a probability measure. Thus, every renewal process naturally leads to two measures on $\left[0 , \infty \right)$: the random counting measure corresponding to the arrival times, and the measure associated with the expected number of arrivals.

For $t\in \left[0 , \infty \right)$, show that $T_{N_{t}}\le t< T_{N_{t}+1}$. That is, $t$ is in the random renewal interval $\left[T_{N_{t}} , T_{N_{t}+1}\right)$.

In the language of reliability, the random variable

$$C_{t}=t-T_{N_{t}}$$is called the current life at time $t$. This variable takes values in the interval $\left[0 , t\right]$ and is the age of the device that is in service at time $t$. The random process $C=\{C_{t}\colon t\ge 0\}$ is the current life process.

The random variable

$$R_{t}=T_{N_{t}+1}-t$$is called the remaining life at time $t$. This variable takes values in the interval $\left(0 , \infty \right)$ and is the time remaining until the device that is in service at time $t$ fails. The random process $R=\{R_{t}\colon t\ge 0\}$ is the remaining life process.

Finally, the random variable

$$L_{t}=C_{t}+R_{t}=T_{N_{t}+1}-T_{N_{t}}=X_{N_{t}+1}$$

is called the total life at time $t$; this variable gives the total life of the device that is in service at time $t$. The random process $L=\{L_{t}\colon t\ge 0\}$ is the total life process.

Tail events of the current and remaining life can be written in terms of each other and in terms of the counting variables. Suppose that $t\in \left[0 , \infty \right)$, $x\in \left[0 , t\right]$, and $y\in \left[0 , \infty \right)$. Show that

- $\{R_{t}> y\}=\{N_{t+y}-N_{t}=0\}$
- $\{C_{t}\ge x\}=\{R_{t-x}> x\}=\{N_{t}-N_{t-x}=0\}$
- $\{C_{t}\ge x, R_{t}> y\}=\{R_{t-x}> x+y\}=\{N_{t+y}-N_{t-x}=0\}$

Of course, the various equivalent events in the last exercise must have the same probability. In particular, it follows that if we know the distribution of $R_{t}$ for all $t$ then we also know the distribution of $C_{t}$ for all $t$, and in fact we know the joint distribution of $R_{t}$ and $C_{t}$ for all $t$ and hence also the distribution of $L_{t}$ for all $t$.

The basic comparison in the following exercise is often useful, particularly for obtaining various bounds. The idea is very simple: if the interarrival times are shortened, the arrivals occur more frequently.

Suppose now that we have two interarrival sequences, $X=(X_{1}, X_{2}, \ldots)$ and $Y=(Y_{1}, Y_{2}, \ldots)$, defined on the same probability space, with $Y_{i}\le X_{i}$ (with probability 1) for each $i$. Show that

- $T_{Y, n}\le T_{X, n}$ for each $n$.
- $N_{Y, t}\ge N_{X, t}$ for each $t$.
- $m_{Y}(t)\ge m_{X}(t)$ for each $t$.
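The coupling in this comparison is easy to see in simulation. A minimal sketch, taking $Y_i = X_i / 2$ as a hypothetical choice of shortened interarrival times:

```python
import itertools
import random

# Coupling sketch: shorten each interarrival time (Y_i = X_i / 2 here, a
# hypothetical choice) and check that arrivals come earlier and counts grow.
random.seed(5)
x = [random.expovariate(1.0) for _ in range(200)]
y = [xi / 2 for xi in x]                      # Y_i <= X_i pointwise

tx = list(itertools.accumulate(x))            # arrival times for X
ty = list(itertools.accumulate(y))            # arrival times for Y

# T_{Y,n} <= T_{X,n} for each n
assert all(b <= a for a, b in zip(tx, ty))

def count(arrivals, t):
    """Counting variable N_t for the given arrival time sequence."""
    return sum(1 for s in arrivals if s <= t)

# N_{Y,t} >= N_{X,t} for each t
assert all(count(ty, t) >= count(tx, t) for t in range(100))
```

Taking expectations of the last assertion gives the renewal function comparison $m_Y(t) \ge m_X(t)$.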

Suppose that $X=(X_{1}, X_{2}, \ldots)$ is a sequence of Bernoulli trials with success parameter $p\in \left(0 , 1\right)$. Recall that $X$ is a sequence of independent, identically distributed indicator variables with $\mathbb{P}(X_{i}=1)=p$. We have studied a number of random processes derived from $X$:

- $Y=(Y_{0}, Y_{1}, \ldots)$ where $Y_{n}$ is the number of successes in the first $n$ trials. The sequence $Y$ is the partial sum process associated with $X$. The variable $Y_{n}$ has the binomial distribution with parameters $n$ and $p$.
- $U=(U_{1}, U_{2}, \ldots)$ where $U_{n}$ is the number of trials needed to go from success number $n-1$ to success number $n$. These are independent variables, each having the geometric distribution with parameter $p$.
- $V=(V_{0}, V_{1}, \ldots)$ where $V_{n}$ is the trial number of success $n$. The sequence $V$ is the partial sum process associated with $U$. The variable $V_{n}$ has the negative binomial distribution with parameters $n$ and $p$.

It is natural to view the successes as arrivals in a discrete-time renewal process.

Consider the renewal process with interarrival sequence $U$.

- Show that the basic assumptions are satisfied and that the mean interarrival time is $\mu =\frac{1}{p}$.
- Show that $V$ is the sequence of arrival times.
- Show that $Y$ is the counting process (restricted to $\mathbb{N}$).
- Show that the renewal function is $m(n)=np$ for $n\in \mathbb{N}$.

It follows that the renewal measure is proportional to counting measure on $\mathbb{N}$.
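A quick Monte Carlo sketch of this renewal process (the parameters $p = 0.3$, $n = 20$ are arbitrary choices) estimates $m(n) = \mathbb{E}(N_n)$ and compares it to $np$:

```python
import random

# Bernoulli-trials renewal process: successes are arrivals, so the counting
# variable N_n is the number of successes Y_n in the first n trials.
# Parameters p and n are arbitrary illustrative choices.
random.seed(11)
p, n, reps = 0.3, 20, 100_000

total = 0
for _ in range(reps):
    total += sum(1 for _ in range(n) if random.random() < p)  # one Y_n
mean_Nn = total / reps
print(mean_Nn, n * p)  # both should be close to 6.0
```

The estimate agrees with $m(n) = np$, which is also immediate from the binomial distribution of $Y_n$.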

Run the binomial timeline experiment 1000 times with an update frequency of 10 for various values of the parameters $n$ and $p$. Note the apparent convergence of the empirical distribution of the counting variable to the true distribution.

Run the negative binomial experiment 1000 times with an update frequency of 10 for various values of the parameters $k$ and $p$. Note the apparent convergence of the empirical distribution of the arrival time to the true distribution.

Consider again the renewal process with interarrival sequence $U$. For $n\in \mathbb{N}$, show that

- The current life and remaining life at time $n$ are independent.
- The remaining life at time $n$ has the same distribution as an interarrival time $U$, namely the geometric distribution with parameter $p$.
- The current life at time $n$ has a truncated geometric distribution with parameters $n$ and $p$: $$\mathbb{P}(C_{n}=k)=\begin{cases}p(1-p)^{k} & \text{if $k\in \{0, 1, \ldots, n-1\}$}\\ (1-p)^{n} & \text{if $k=n$}\end{cases}$$
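The truncated geometric law of the current life is easy to confirm empirically. In the sketch below ($p = 0.4$, $n = 10$ are arbitrary choices), $C_n$ is the time since the most recent success, with $C_n = n$ when no success has occurred by time $n$:

```python
import random
from collections import Counter

# Empirical check of the truncated geometric distribution of C_n in the
# Bernoulli-trials renewal process (p and n are arbitrary choices).
random.seed(2)
p, n, reps = 0.4, 10, 200_000

counts = Counter()
for _ in range(reps):
    last = 0  # time of the most recent arrival in [0, n]; 0 if none (T_0 = 0)
    for i in range(1, n + 1):
        if random.random() < p:
            last = i
    counts[n - last] += 1  # C_n = n - T_{N_n}

for k in range(n + 1):
    exact = p * (1 - p) ** k if k < n else (1 - p) ** n
    print(k, counts[k] / reps, exact)  # empirical vs exact, close for each k
```

The boundary case $k = n$ carries the leftover mass $(1-p)^n$, exactly the probability of no success in the first $n$ trials.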

This renewal process starts over, independently of the past, not only at the arrival times, but at fixed times $n\in \mathbb{N}$ as well. The Bernoulli trials process (with the successes as arrivals) is the only discrete-time renewal process with this property, which is a consequence of the memoryless property of the geometric interarrival distribution.

We can also use the indicator variables as the interarrival times. This may seem strange at first, but actually turns out to be useful.

Consider the renewal process with interarrival sequence $X$.

- Show that the basic assumptions are satisfied and that the mean interarrival time is $\mu =p$.
- Show that $Y$ is the sequence of arrival times.
- Show that the number of arrivals at time 0 is $U_{1}-1$ and the number of arrivals at time $i$ is $U_{i+1}$ for $i\in \mathbb{N}_+$.
- Show that the number of arrivals in the interval $\left[0 , n\right]$ is $V_{n+1}-1$ for $n\in \mathbb{N}$. This gives the counting process.
- Show that the renewal function is $m(n)=\frac{n+1}{p}-1$ for $n\in \mathbb{N}$.

The age processes are not very interesting for this renewal process. Show that with probability 1,

- $C_{n}=0$
- $R_{n}=1$

As an application of the last renewal process, we can show that the moment generating function of the counting variable $N_{t}$ in an arbitrary renewal process is finite in an interval about 0 for any $t$. This implies that $N_{t}$ has finite moments of all orders and in particular that $m(t)< \infty $ for any $t$.

Suppose that $X=(X_{1}, X_{2}, \ldots)$ is the interarrival sequence for a renewal process. By the basic assumptions, there exists $a> 0$ such that

$$p=\mathbb{P}(X\ge a)> 0$$

We now consider the renewal process with interarrival sequence $X_{a}=(X_{a, 1}, X_{a, 2}, \ldots)$, where

$$X_{a, i}=a\,\mathbf{1}(X_{i}\ge a)\text{,\hspace{1em}}i\in \mathbb{N}_+$$

The renewal process with interarrival sequence $X_{a}$ is just like the renewal process in Exercise 22, except that the arrivals occur at the points in the sequence $\left(0, a, 2a, \ldots\right)$.

For each $t$, $N_{t}$ has finite moment generating function in an interval about 0.

- Show that $X_{a, i}\le X_{i}$ for each $i$.
- Compute the moment generating function of the geometric distribution with parameter $p$.
- Show that $\mathbb{E}(e^{\theta N_{a, t}})< \infty $ for $\theta < -\ln (1-p)$.
- Conclude that $\mathbb{E}(e^{\theta N_{t}})< \infty $ for $\theta < -\ln (1-p)$.

The Poisson process, named after Siméon Poisson, is the most important of all renewal processes. The Poisson process is so important that it is treated in a separate chapter in this project. Please review the essential properties of this process:

- The interarrival times have an exponential distribution with rate parameter $r$. Thus, the basic assumptions above are satisfied and the mean interarrival time is $\mu =\frac{1}{r}$.
- The exponential distribution is the only distribution with the memoryless property on $\left[0 , \infty \right)$.
- The time of the $n$th arrival $T_{n}$ has the gamma distribution with shape parameter $n$ and rate parameter $r$.
- The counting process $N=\{N_{t}\colon t\ge 0\}$ has stationary, independent increments, and $N_{t}$ has the Poisson distribution with parameter $rt$.
- In particular, the renewal function is $m(t)=rt$ for $t\in \left[0 , \infty \right)$. Hence, the renewal measure is a multiple of the standard length measure (Lebesgue measure) on $\left[0 , \infty \right)$.

Consider again the Poisson process with rate parameter $r$. For $t\in \left[0 , \infty \right)$, show that

- The current life and remaining life at time $t$ are independent.
- The remaining life at time $t$ has the same distribution as an interarrival time $X$, namely the exponential distribution with rate parameter $r$.
- The current life at time $t$ has a truncated exponential distribution with parameters $t$ and $r$: $$\mathbb{P}(C_{t}\ge s)=\begin{cases}e^{-rs} & \text{if $0\le s\le t$}\\ 0 & \text{if $s> t$}\end{cases}$$

The Poisson process starts over, independently of the past, not only at the arrival times, but at fixed times $t\in \left[0 , \infty \right)$ as well. The Poisson process is the only renewal process with this property, which is a consequence of the memoryless property of the exponential interarrival distribution.
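Both the renewal function $m(t) = rt$ and the memoryless property of the remaining life can be checked by simulating the Poisson process directly from its exponential interarrival times. The parameters $r = 1.5$, $t = 4$ below are arbitrary choices:

```python
import random

# Simulate the Poisson process with rate r and check two facts:
# (1) m(t) = E(N_t) = r*t, and (2) the remaining life R_t is again
# exponential with rate r, so E(R_t) = 1/r (memoryless property).
random.seed(4)
r, t, reps = 1.5, 4.0, 100_000

counts, remaining = [], []
for _ in range(reps):
    s, cnt = 0.0, 0
    while True:
        s += random.expovariate(r)  # next arrival time T_{n+1}
        if s > t:
            break
        cnt += 1
    counts.append(cnt)          # N_t
    remaining.append(s - t)     # R_t = T_{N_t + 1} - t

print(sum(counts) / reps, r * t)        # both should be close to 6.0
print(sum(remaining) / reps, 1 / r)     # both should be close to 0.667
```

The mean of the remaining life matches $1/r$ regardless of $t$, which is the memoryless property in action; for a non-exponential interarrival distribution the analogous experiment would show a $t$-dependent remaining life.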

Run the Poisson experiment 1000 times with an update frequency of 10 for various values of the parameters $t$ and $r$. Note the apparent convergence of the empirical distribution of the counting variable to the true distribution.

Run the gamma experiment 1000 times with an update frequency of 10 for various values of the parameters $k$ and $r$. Note the apparent convergence of the empirical distribution of the arrival time to the true distribution.