The strong renewal assumption means that the Poisson process must probabilistically restart at a fixed time $s$. In particular, if the first arrival has not occurred by time $s$, then the time remaining until the arrival occurs must have the same distribution as the first arrival time itself. This is known as the memoryless property and can be stated in terms of a generic interarrival time $X$ as follows: the conditional distribution of $X-s$ given $X> s$ is the same as the distribution of $X$. Equivalently,

$$\mathbb{P}(X> t+s \mid X> s)=\mathbb{P}(X> t)\text{,\hspace{1em}}s\ge 0\text{,\hspace{0.5em}}t\ge 0$$

Let $G$ denote the right-tail distribution function of $X$, so that $G(t)=\mathbb{P}(X> t)$ for $t\ge 0$.
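Taking for granted the exponential form derived below, the memoryless property can be checked empirically: among samples with $X> s$, the conditional tail probability should match the unconditional one. A minimal sketch; the values of $r$, $s$, $t$, and the sample size are arbitrary illustrative choices.

```python
import math
import random

# Empirically compare P(X > t + s | X > s) with P(X > t) for an
# exponentially distributed X; r, s, t are arbitrary illustrative values.
random.seed(2)
r, s, t = 1.0, 0.7, 1.2
n = 200_000
sample = [random.expovariate(r) for _ in range(n)]
survivors = [x for x in sample if x > s]
cond = sum(1 for x in survivors if x > t + s) / len(survivors)
uncond = sum(1 for x in sample if x > t) / n
print(cond, uncond)  # both near exp(-r t)
```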

Show that the memoryless property is equivalent to the law of exponents:

$$G(t+s)=G(s)G(t)\text{,\hspace{1em}}s\ge 0\text{,\hspace{0.5em}}t\ge 0$$

Show that the only solutions of the functional equation in Exercise 1 that are continuous from the right are exponential functions. Let $c=G(1)$. Successively show that

- $G(n)=c^{n}$ for $n\in \mathbb{N}$.
- $G(\frac{1}{n})=c^{1/n}$ for $n\in \mathbb{N}_{+}$.
- $G(\frac{m}{n})=c^{m/n}$ for $m\in \mathbb{N}$ and $n\in \mathbb{N}_{+}$.
- $G(t)=c^{t}$ for $t\ge 0$.

In the context of Exercise 2, let $r=-\ln c$. Then $r> 0$ (since $0< c< 1$) so

$$G(t)=\mathbb{P}(X> t)=e^{-r t}\text{,\hspace{1em}}t\ge 0$$

Hence $X$ has a continuous distribution with cumulative distribution function given by

$$F(t)=\mathbb{P}(X\le t)=1-e^{-r t}\text{,\hspace{1em}}t\ge 0$$

Show that the probability density function of $X$ is

$$f(t)=r e^{-r t}\text{,\hspace{1em}}t\ge 0$$

A random variable with this probability density function is said to have the exponential distribution with rate parameter $r$. The reciprocal $\frac{1}{r}$ is known as the scale parameter.

Show directly that the exponential probability density function is a valid probability density function.
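Analytically, $\int_0^\infty r e^{-r t}\,dt = \left[-e^{-r t}\right]_0^\infty = 1$. A quick numeric sketch of the same fact, with $r = 2$ as an arbitrary illustrative rate:

```python
import math

# Numerically check that f(t) = r * exp(-r t) integrates to 1 over [0, inf).
# r = 2 is an arbitrary illustrative rate.
r = 2.0

def f(t):
    return r * math.exp(-r * t)

# Composite midpoint rule on [0, 50]; the tail beyond 50 is negligible here.
n, upper = 200_000, 50.0
dt = upper / n
total = sum(f((k + 0.5) * dt) for k in range(n)) * dt
print(total)  # close to 1
```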

In the gamma experiment, set $k=1$ so that the simulated random variable has an exponential distribution. Vary $r$ with the scroll bar and watch how the shape of the probability density function changes. For selected values of $r$, run the experiment 1000 times with an update frequency of 10, and watch the apparent convergence of the empirical density function to the probability density function.

Show that the quantile function of $X$ is

$$F^{-1}(p)=\frac{-\ln (1-p)}{r}\text{,\hspace{1em}}0< p< 1$$

In particular, show that

- The median of $X$ is $\frac{\ln 2}{r}$
- The first quartile of $X$ is $\frac{\ln 4-\ln 3}{r}$
- The third quartile of $X$ is $\frac{\ln 4}{r}$
- The interquartile range is $\frac{\ln 3}{r}$
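These special values follow by plugging $p = \frac{1}{2}, \frac{1}{4}, \frac{3}{4}$ into the quantile formula, which a short sketch confirms; $r = 0.5$ is an arbitrary illustrative rate.

```python
import math

# Check the quantile formula F^{-1}(p) = -ln(1 - p) / r against the
# special values above; r = 0.5 is an arbitrary illustrative rate.
r = 0.5

def quantile(p):
    return -math.log(1.0 - p) / r

median = quantile(0.5)   # equals ln(2)/r
q1 = quantile(0.25)      # equals (ln 4 - ln 3)/r
q3 = quantile(0.75)      # equals ln(4)/r
print(median, q1, q3, q3 - q1)  # IQR equals ln(3)/r
```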

Recall that if a nonnegative random variable with a continuous distribution is interpreted as the lifetime of a device, then the failure rate function is

$$h(t)=\frac{f(t)}{1-F(t)}\text{,\hspace{1em}}t\ge 0$$

where, as usual, $f$ denotes the probability density function and $F$ the cumulative distribution function.

Show that the exponential distribution with rate parameter $r$ has constant failure rate $r$, and is the only such distribution.
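Numerically, the ratio $f(t)/(1-F(t)) = r e^{-r t}/e^{-r t}$ collapses to $r$ at every $t$; a minimal sketch with $r = 3$ as an arbitrary illustrative rate:

```python
import math

# Check that h(t) = f(t) / (1 - F(t)) equals r at several points for the
# exponential distribution; r = 3 is an arbitrary illustrative rate.
r = 3.0
hazards = []
for t in (0.0, 0.5, 1.0, 5.0):
    f = r * math.exp(-r * t)
    F = 1.0 - math.exp(-r * t)
    hazards.append(f / (1.0 - F))
print(hazards)  # each entry equals r = 3.0 up to rounding
```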

The following exercises give the mean, variance, and moment generating function of the exponential distribution.

Show that $\mathbb{E}(X)=\frac{1}{r}$.

Show that $\operatorname{var}(X)=\frac{1}{r^{2}}$.

Show that $\mathbb{E}(e^{uX})=\frac{r}{r-u}$ for $u< r$.
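A Monte Carlo sanity check of these three formulas; the rate $r = 2$, the argument $u = 0.5$, and the sample size are arbitrary illustrative choices.

```python
import math
import random

# Monte Carlo check of the mean 1/r, standard deviation 1/r, and
# moment generating function r/(r - u); r = 2, u = 0.5 are arbitrary.
random.seed(12345)
r, u = 2.0, 0.5
n = 200_000
sample = [random.expovariate(r) for _ in range(n)]
mean = sum(sample) / n
var = sum((x - mean) ** 2 for x in sample) / n
mgf = sum(math.exp(u * x) for x in sample) / n
print(mean)            # about 1/r = 0.5
print(math.sqrt(var))  # about 1/r = 0.5
print(mgf)             # about r/(r - u) = 4/3
```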

In the context of the Poisson process, the parameter $r$ is known as the rate of the process. On average, there are $\frac{1}{r}$ time units between arrivals, so the arrivals come at an average rate of $r$ per unit time.

Note also that the mean and standard deviation are equal for an exponential distribution, and that the median is always smaller than the mean.

In the gamma experiment, set $k=1$ so that the simulated random variable has an exponential distribution. Vary $r$ with the scroll bar and watch how the mean/standard deviation bar changes. Now set $r=0.5$, run the experiment 1000 times with an update frequency of 10, and watch the apparent convergence of the empirical mean and standard deviation to the distribution mean and standard deviation, respectively.

Show that $\mathbb{E}(X^{n})=\frac{\Gamma(n+1)}{r^{n}}$ for $n> 0$ where $\Gamma$ is the gamma function. In particular, $\mathbb{E}(X^{n})=\frac{n!}{r^{n}}$ if $n\in \mathbb{N}$.
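A Monte Carlo sketch of the moment formula for integer $n$; $r = 2$ and the sample size are arbitrary illustrative choices.

```python
import math
import random

# Monte Carlo check of E(X^n) = Gamma(n+1) / r^n (= n!/r^n for integer n);
# r = 2 and the sample size are arbitrary illustrative choices.
random.seed(7)
r = 2.0
sample = [random.expovariate(r) for _ in range(200_000)]
for n in (1, 2, 3):
    moment = sum(x ** n for x in sample) / len(sample)
    print(n, moment, math.gamma(n + 1) / r ** n)  # the two values agree
```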

The exponential distribution has an amazing number of interesting mathematical properties; some of these properties are satisfied only by the exponential distribution, and thus serve as characterizations.

Suppose that $X$ has the exponential distribution with rate parameter $r$. Show that the following random variables have geometric distributions on $\mathbb{N}$ and on $\mathbb{N}_{+}$, respectively, each with parameter $1-e^{-r}$.

- $\lfloor X\rfloor $, the largest integer less than or equal to $X$.
- $\lceil X\rceil $, the smallest integer greater than or equal to $X$.
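For the floor, note that $\mathbb{P}(\lfloor X\rfloor = k) = e^{-r k} - e^{-r(k+1)} = (1-p)^{k} p$ with $p = 1 - e^{-r}$. A simulation sketch; $r = 0.7$ and the sample size are arbitrary illustrative choices.

```python
import math
import random

# Simulate floor(X) for exponential X and compare the empirical
# frequencies with the geometric form p (1-p)^k, p = 1 - exp(-r).
# r = 0.7 and the sample size are arbitrary illustrative choices.
random.seed(42)
r = 0.7
n_samples = 100_000
counts = {}
for _ in range(n_samples):
    k = math.floor(random.expovariate(r))
    counts[k] = counts.get(k, 0) + 1
p = 1.0 - math.exp(-r)
for k in range(4):
    print(k, counts.get(k, 0) / n_samples, p * (1.0 - p) ** k)
```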

In many respects, the geometric distribution is a discrete version of the exponential distribution.

Suppose that $X$ and $Y$ have exponential distributions with parameters $a$ and $b$, respectively, and are independent. Show that

$$\mathbb{P}(X< Y)=\frac{a}{a+b}$$

Suppose that $(X_{1}, X_{2}, \ldots, X_{n})$ is a sequence of independent random variables, and that $X_{i}$ has the exponential distribution with rate parameter $r_{i}> 0$ for each $i\in \{1, 2, \ldots, n\}$.

- Find the distribution function and density function of $U=\min\{X_{1}, X_{2}, \ldots, X_{n}\}$.
- Find the distribution function of $V=\max\{X_{1}, X_{2}, \ldots, X_{n}\}$.
- Find the density function of $V$ in the special case that $r_{i}=r$ for each $i\in \{1, 2, \ldots, n\}$.

Note that the minimum $U$ in part (a) has the exponential distribution with parameter $r_{1}+r_{2}+\cdots+r_{n}$. In the context of reliability, if a series system has independent components, each with an exponentially distributed lifetime, then the lifetime of the system is also exponentially distributed, and the failure rate of the system is the sum of the component failure rates. In the context of random processes, if we have $n$ independent Poisson processes, then the new process obtained by combining the random times is also Poisson, and the rate of the new process is the sum of the rates of the individual processes (we will return to this point later).
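A simulation sketch of this fact: the tail of the minimum matches $e^{-(r_1+r_2+r_3)t}$. The rates and the sample size are arbitrary illustrative choices.

```python
import math
import random

# Simulate U = min(X_1, X_2, X_3) with given rates and compare P(U > t)
# with exp(-(r_1 + r_2 + r_3) t); the rates are arbitrary choices.
random.seed(1)
rates = [0.5, 1.0, 1.5]
total_rate = sum(rates)
n = 100_000
mins = [min(random.expovariate(r) for r in rates) for _ in range(n)]
for t in (0.2, 0.5, 1.0):
    empirical = sum(1 for u in mins if u > t) / n
    print(t, empirical, math.exp(-total_rate * t))  # the two values agree
```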

In the order statistic experiment, select the exponential distribution.

- Set $k=1$ (this gives the minimum $U$). Vary $n$ with the scroll bar and note the shape of the density function. For selected values of $n$, run the simulation 1000 times, updating every 10 runs. Note the apparent convergence of the empirical density function to the true density function.
- Vary $n$ with the scroll bar, set $k=n$ each time (this gives the maximum $V$), and note the shape of the density function. For selected values of $n$, run the simulation 1000 times, updating every 10 runs. Note the apparent convergence of the empirical density function to the true density function.

We can now generalize Exercise 15:

In the setting of Exercise 16, show that for $i\in \{1, 2, \ldots, n\}$,

$$\mathbb{P}(X_{i}< X_{j} \text{ for all } j\neq i)=\frac{r_{i}}{\sum_{j=1}^{n} r_{j}}$$

- First, note that $X_{i}< X_{j}$ for all $j\neq i$ if and only if $X_{i}< \min\{X_{j}\colon j\neq i\}$.
- Note that the minimum on the right is independent of $X_{i}$ and, by Exercise 16, has an exponential distribution with parameter $\sum_{j\neq i} r_{j}$.
- Now use Exercise 15.
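A simulation sketch of this result: the frequency with which each $X_i$ is the smallest is proportional to its rate. The rates and the sample size are arbitrary illustrative choices.

```python
import random

# Simulate which X_i is smallest and compare the winning frequencies
# with r_i / sum(r_j); the rates here are arbitrary illustrative choices.
random.seed(3)
rates = [1.0, 2.0, 3.0]
n = 100_000
wins = [0] * len(rates)
for _ in range(n):
    draws = [random.expovariate(r) for r in rates]
    wins[draws.index(min(draws))] += 1
total = sum(rates)
for i, r in enumerate(rates):
    print(i, wins[i] / n, r / total)  # empirical vs. theoretical share
```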

The results in Exercise 16 and Exercise 18 are very important in the theory of continuous-time Markov chains. Suppose that $X_{i}$ is the time until an event of interest occurs (the arrival of a customer, the failure of a device, etc.) for each $i$; these times are independent and exponentially distributed. Then the *first* time $U$ that one of the events occurs is also exponentially distributed (Exercise 16 (a)), and the probability that the first event to occur is event $i$ is proportional to the rate $r_{i}$.

The next exercise gives a randomized version of the memoryless property:

Suppose that $X$ and $Y$ are independent and that $Y$ has the exponential distribution with rate parameter $r> 0$. Show that $X$ and $Y-X$ are conditionally independent given $X< Y$, and that the conditional distribution of $Y-X$ is also exponential with parameter $r$.
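A simulation sketch of the conditional distribution claim: keep the pairs with $X< Y$ and check that the gap $Y-X$ still has the exponential tail $e^{-r t}$. Taking $X$ uniform on $[0, 2]$ is an arbitrary illustrative choice.

```python
import math
import random

# Randomized memoryless property: on the event X < Y, the gap Y - X
# should still have tail exp(-r t). X uniform on [0, 2] is arbitrary.
random.seed(5)
r = 1.0
pairs = [(random.uniform(0, 2), random.expovariate(r)) for _ in range(300_000)]
gaps = [y - x for x, y in pairs if x < y]
for t in (0.5, 1.0, 2.0):
    emp = sum(1 for g in gaps if g > t) / len(gaps)
    print(t, emp, math.exp(-r * t))  # the two values agree
```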

Consider again the setting of Exercise 16. Show that

$$\mathbb{P}(X_{1}< X_{2}< \cdots< X_{n})=\prod_{i=1}^{n} \frac{r_{i}}{\sum_{j=i}^{n} r_{j}}$$

Of course, the probabilities of other orderings can be computed by permuting the parameters appropriately in the formula on the right.
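A simulation sketch of the ordering probability for $n=3$; the rates and the sample size are arbitrary illustrative choices.

```python
import random

# Simulate P(X_1 < X_2 < X_3) and compare with the product formula;
# the rates and sample size are arbitrary illustrative choices.
random.seed(9)
rates = [1.0, 2.0, 3.0]
n = 200_000
hits = 0
for _ in range(n):
    d = [random.expovariate(r) for r in rates]
    if d[0] < d[1] < d[2]:
        hits += 1
product = 1.0
for i in range(len(rates)):
    product *= rates[i] / sum(rates[i:])
print(hits / n, product)  # both near 1/15
```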

Suppose that the length of a telephone call (in minutes) is exponentially distributed with rate parameter $r=0.2$.

- Find the probability that the call lasts between 2 and 7 minutes.
- Find the median, the first and third quartiles, and the interquartile range of the call length.
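A numeric sketch of the answers, using the tail $e^{-r t}$ and the quantile formulas derived above:

```python
import math

# Worked answers for the telephone call exercise, with r = 0.2 per minute.
r = 0.2
# P(2 < X < 7) = exp(-2 r) - exp(-7 r)
p = math.exp(-r * 2) - math.exp(-r * 7)
print(p)  # about 0.4237
median = math.log(2) / r
q1 = (math.log(4) - math.log(3)) / r
q3 = math.log(4) / r
print(median, q1, q3, q3 - q1)  # quartiles and IQR, in minutes
```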

Suppose that the lifetime of a certain electronic component (in hours) is exponentially distributed with rate parameter $r=0.001$.

- Find the probability that the component lasts at least 2000 hours.
- Find the median, the first and third quartiles, and the interquartile range of the lifetime.

Suppose that the time between requests to a web server (in seconds) is exponentially distributed with rate parameter 2.

- Give the mean and standard deviation of the time between requests.
- Find the probability that the time between requests is less than 0.5 seconds.
- Find the median, the first and third quartiles, and the interquartile range of the time between requests.

Suppose that the lifetime $X$ of a fuse (in 100 hour units) is exponentially distributed with $\mathbb{P}(X> 10)=0.8$.

- Find the rate parameter.
- Find the mean and standard deviation.
- Find the median, the first and third quartiles, and the interquartile range of the lifetime.
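A numeric sketch of the answers: the condition $\mathbb{P}(X> 10)=0.8$ gives $e^{-10 r}=0.8$, so $r=-\ln(0.8)/10$.

```python
import math

# Worked answers for the fuse exercise: solve e^{-10 r} = 0.8 for r.
r = -math.log(0.8) / 10
print(r)           # about 0.0223 per 100-hour unit
mean = sd = 1 / r  # mean and standard deviation are equal
print(mean, sd)
median = math.log(2) / r
q1 = (math.log(4) - math.log(3)) / r
q3 = math.log(4) / r
print(median, q1, q3, q3 - q1)  # quartiles and IQR, in 100-hour units
```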

The position $X$ of the first defect on a digital tape (in cm) has the exponential distribution with mean 100.

- Find the rate parameter.
- Find the probability that $X< 200$ given $X> 150$.
- Find the standard deviation.
- Find the median, the first and third quartiles, and the interquartile range of the position.
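A numeric sketch of the answers: mean 100 gives $r=\frac{1}{100}$, and the conditional probability follows from the memoryless property.

```python
import math

# Worked answers for the defect-position exercise: mean 100 gives r = 1/100.
r = 1 / 100
print(r)
# By the memoryless property,
# P(X < 200 | X > 150) = 1 - P(X > 200 | X > 150) = 1 - P(X > 50)
p = 1 - math.exp(-50 * r)
print(p)  # about 0.3935
sd = 1 / r
print(sd)  # 100 cm, equal to the mean
median = math.log(2) / r
q1 = (math.log(4) - math.log(3)) / r
q3 = math.log(4) / r
print(median, q1, q3, q3 - q1)  # quartiles and IQR, in cm
```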