Poisson Process, Spike Train

Counting Process

Figure 1: Counting process
\begin{figure}\centering\epsfig{file=customers.eps, width=4in}\end{figure}
Fig. 1 shows a graph of customers entering a bank. Every time a customer arrives, the counter increases by one. The arrival time of the $ i$-th customer is $ t_i$. Since the customers arrive at random, the sequence $ \left\{t_1, t_2, \cdots, t_m \right\}$, denoted shortly by $ \left\{t_i \right\}$, is a random sequence. Likewise, the number of customers who arrived in the interval $ (t_0, t]$ is a random variable (process). Such a process is right continuous, as indicated by the graph in Fig. 1.

As is often the case in the theory of stochastic processes, we assume that the index set, i.e. the set from which $ \left\{t_i \right\}$ takes its values, is $ T=[0,\infty)$. Therefore, we have a sequence of non-negative random variables

$\displaystyle 0\leq t_0<t_1<t_2<\cdots < t_m \rightarrow \infty\quad$   as$\displaystyle \quad m\rightarrow \infty.$

WLOG (without loss of generality), let $ t_0=0$ and $ N_0=0$; then

$\displaystyle N_t=\max\{n:\; t_n\le t\}, \quad T=[0,\infty),$

is called a point process (counting process), and is denoted shortly by $ \{N_t,  t\ge 0\}$.

Let $ T_n\triangleq t_n-t_{n-1}$ be the $ n$-th inter-arrival time; then the sequence of inter-arrival times $ \{T_n,\; n\ge 1\}$ is another stochastic process.

A special case arises when $ \{T_n,\; n\ge 1\}$ is a sequence of i.i.d. (independent and identically distributed) random variables; then the sequence $ \{t_n\}$ is called a renewal process. $ \{N_t,\; t\ge 0\}$ is the associated renewal point process, sometimes also called a renewal process. Also, keep in mind that $ t_n=T_1+T_2+\cdots+T_n$.
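These two definitions translate directly into code. The following minimal sketch (the inter-arrival times are arbitrary, chosen dyadic so the floating-point sums are exact) computes $ t_n$ as the cumulative sum of the $ T_i$ and evaluates $ N_t=\max\{n:\; t_n\le t\}$:

```python
import bisect

def arrival_times(interarrivals):
    """Cumulative sums: t_n = T_1 + T_2 + ... + T_n."""
    times, total = [], 0.0
    for T in interarrivals:
        total += T
        times.append(total)
    return times

def N(t, times):
    """Counting process: N_t = max{n : t_n <= t} (right continuous at arrivals)."""
    return bisect.bisect_right(times, t)

# hypothetical inter-arrival times (dyadic values, so the sums are exact)
Ts = [0.5, 1.25, 0.25, 2.0]
ts = arrival_times(Ts)                       # t_1..t_4 = 0.5, 1.75, 2.0, 4.0
counts = [N(x, ts) for x in (0.4, 0.5, 1.9, 4.0, 5.0)]
```

Note that `bisect_right` realizes the right continuity seen in Fig. 1: at an arrival time the count already includes that arrival.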

Definition (Poisson process) A point process $ \{N_t,  t\ge 0\}$ is called a Poisson process if $ N_0=0$ and $ \{N_t\}$ satisfies the following conditions

  1. its increments are stationary and its non-overlapping increments are independent
  2. $ P(N_{t+\triangle t}-N_t=1)=\lambda \triangle t +o(\triangle t)$
  3. $ P(N_{t+\triangle t}-N_t\ge2)=o(\triangle t)$

Theorem Let $ \{N_t,  t\ge 0\}$ be a Poisson process, then

$\displaystyle \boxed{P(N_t=k)=\frac{(\lambda\, t)^k}{k!}e^{-\lambda\, t}}\quad k=0,\,1,\, \cdots$ (1)

The expression on the left hand side of (1) represents the probability of $ k$ arrivals in the interval $ (0,  t]$.
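Formula (1) can be checked numerically with a short sketch (the rate and interval length below are arbitrary choices): the probabilities should sum to one, and the mean of the distribution should equal $ \lambda\,t$.

```python
from math import exp, factorial

def poisson_pmf(k, lam, t):
    """P(N_t = k) = (lam*t)^k / k! * exp(-lam*t), as in (1)."""
    mu = lam * t
    return mu ** k / factorial(k) * exp(-mu)

lam, t = 2.0, 1.5                                 # arbitrary rate and interval length
probs = [poisson_pmf(k, lam, t) for k in range(60)]
total = sum(probs)                                # probabilities sum to (essentially) 1
mean = sum(k * p for k, p in enumerate(probs))    # should be lam*t = 3
```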

Proof The generating function of a discrete random variable $ X$ is defined via the following z-transform (recall that the moment generating function of a continuous random variable is defined through the Laplace transform):

$\displaystyle \mathbf{G}_X(z)=E[z^X]=\sum_{i=0}^{\infty}z^i p_i,$

where $ p_i=P(X=i)$. Let us assume that $ X$ is a Poisson random variable with parameter $ \mu$, then

$\displaystyle P(X=i)=\frac{\mu^i}{i!}e^{-\mu}\qquad i=0,\,1,\,2,\cdots$

and

$\displaystyle \mathbf{G}_X(z)=\sum_{i=0}^{\infty}z^i \frac{\mu^i}{i!}e^{-\mu}= e^{\mu(z-1)}.$ (2)

Going back to the Poisson process, define the generating function as

$\displaystyle \mathbf{G}_t(z)\triangleq E[z^{N_t}].$

Then we can write

\begin{displaymath}\begin{split}\mathbf{G}_{t+\triangle t}(z)&=E[z^{N_{t+\triangle t}}]=E[z^{N_t}\,z^{N_{t+\triangle t}-N_t}]=\mathbf{G}_t(z)\,E[z^{N_{t+\triangle t}-N_t}]\\ &=\mathbf{G}_t(z)\,\lbrack (1-\lambda\,\triangle t+o(\triangle t))\,z^0+(\lambda\,\triangle t+o(\triangle t))\,z^1 +o(\triangle t)(z^2+\cdots) \rbrack \end{split}\end{displaymath}    

Furthermore

\begin{displaymath}\begin{split}\frac{\mathbf{G}_{t+\triangle t}(z)-\mathbf{G}_t(z)}{\triangle t}&=\mathbf{G}_t(z)\,\left[\lambda\,(z-1)+\frac{o(\triangle t)}{\triangle t}\,(\cdots)\right]\\ \triangle t \rightarrow 0 \;\Rightarrow\; \frac{d\mathbf{G}_t(z)}{dt}&=\lambda\,(z-1)\,\mathbf{G}_t(z), \qquad \mathbf{G}_0(z)=E[z^{N_0}]=1\\ &\Rightarrow \mathbf{G}_t(z)=e^{\lambda\,t\,(z-1)} \end{split}\end{displaymath}    

Comparing this result to (2) we conclude that $ N_t$ is a Poisson random variable with parameter $ \lambda  t$. $ \blacksquare$

Theorem If $ \{N_t,  t\ge 0\}$ is a Poisson process and $ T_n$ is the inter-arrival time between the $ n-$th and $ (n-1)-$th events, then $ \{T_n,   n\ge 1\}$ is a sequence of i.i.d. random variables with exponential distribution, with parameter $ \lambda$.

Proof

$\displaystyle P(T_1>t)=P(N_t=0)=e^{-\lambda\, t} \quad \Rightarrow \quad T_1 -$   exponential$\displaystyle $

Figure 2: Event description
\begin{figure}\centering\epsfig{file=intervals1.eps, width=3in}\end{figure}
We need to show that $ T_1$ and $ T_2$ are independent and that $ T_2$ is also exponential.

$\displaystyle P(T_2>t \vert T_1\in(s-\delta, s+\delta])= \frac{P(T_2>t, T_1\in(s-\delta, s+\delta])}{P(T_1\in(s-\delta, s+\delta])}$ (3)

The event $ \{T_2>t, T_1\in(s-\delta, s+\delta]\}$ is a subset of the event described by Fig. 2, i.e.

$\displaystyle P(T_2>t,\, T_1\in(s-\delta,\, s+\delta])\le P(\underbrace{N_{s-\delta}=0}_{\text{no arrivals}},\; \underbrace{N_{s+\delta}-N_{s-\delta}=1}_{\text{one arrival}},\; \underbrace{N_{s+t-\delta}-N_{s+\delta}=0}_{\text{no arrivals}})$

\begin{displaymath}\begin{split}P(T_2>t,\,T_1\in(s-\delta,\,s+\delta]) &\le P(N_{s-\delta}=0,\,N_{s+\delta}-N_{s-\delta}=1)\,P(N_{s+t-\delta}-N_{s+\delta}=0)\\ &=P(T_1\in(s-\delta,\,s+\delta])\,e^{-\lambda\,(t-2\,\delta)} \end{split}\end{displaymath}    

From (3) $ \Rightarrow$

$\displaystyle P(T_2>t \vert T_1\in(s-\delta, s+\delta]) \le e^{-\lambda (t-2 \delta)}$ (4)

Similarly, the event described by Fig. 3 is a subset of the event $ \{T_2>t, T_1\in(s-\delta, s+\delta]\}$, therefore

\begin{displaymath}\begin{split}P(N_{s-\delta}=0, \,N_{s+\delta}-N_{s-\delta}=1,\,N_{s+\delta+t}-N_{s+\delta}=0)&=P(T_1\in(s-\delta,\,s+\delta])\,e^{-\lambda\,t}\\ &\le P(T_2>t,\,T_1\in(s-\delta,\,s+\delta]) \end{split}\end{displaymath}    

From (3) $ \Rightarrow$

$\displaystyle P(T_2>t \vert T_1\in(s-\delta, s+\delta]) \ge e^{-\lambda t}$ (5)

Figure 3: Event description
\begin{figure}\centering\epsfig{file=intervals2.eps, width=3in}\end{figure}
From (4) and (5), using the squeeze theorem $ (\delta \rightarrow 0)$, it follows that

$\displaystyle P(T_2>t \vert T_1=s)=e^{-\lambda\, t}\quad \Rightarrow \quad f_{T_2 \vert T_1=s}(t \vert s)=\lambda\, e^{-\lambda\, t}$

Therefore, $ T_2$ is independent of $ T_1$ (the conditional distribution does not depend on $ s$), and $ T_2$ is an exponentially distributed random variable. The same argument applies to all subsequent inter-arrival times. $ \blacksquare$

Theorem

  1. $ E[N_t]=\lambda t$
  2. $ Var[N_t]=\lambda t$
Proof Recall that $ \mathbf{G}_t(z)=E[z^{N_t}]$, then

\begin{displaymath}\begin{split}&\left[\frac{d\mathbf{G}_t(z)}{dz}\right]_{z=1} = \left[E[N_t\,z^{N_t-1}]\right]_{z=1}=E[N_t]\\ &E[N_t]=\left[\lambda\,t\, e^{\lambda\,t(z-1)}\right]_{z=1}= \lambda\,t \end{split}\end{displaymath}    

Likewise

\begin{displaymath}\begin{split}&\left[\frac{d^2\mathbf{G}_t(z)}{dz^2}\right]_{z=1} = E[N_t(N_t-1)]=\left[(\lambda\,t)^2\, e^{\lambda\,t(z-1)}\right]_{z=1}=(\lambda\,t)^2\\ &Var[N_t]=E[N_t^2]-E[N_t]^2=(\lambda\,t)^2+\lambda\,t-(\lambda\,t)^2=\lambda\,t \quad\blacksquare \end{split}\end{displaymath}    
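A Monte Carlo sanity check of $ E[N_t]=Var[N_t]=\lambda\,t$ (a minimal sketch; the rate, horizon, and ensemble size are arbitrary) can exploit the preceding theorem and build each sample of $ N_t$ from exponential inter-arrival times:

```python
import random

random.seed(0)

def sample_N(lam, t):
    """One draw of N_t: accumulate exponential inter-arrival times up to t."""
    n, clock = 0, random.expovariate(lam)
    while clock <= t:
        n += 1
        clock += random.expovariate(lam)
    return n

lam, t, runs = 4.0, 2.0, 20000       # arbitrary rate, horizon, ensemble size
samples = [sample_N(lam, t) for _ in range(runs)]
mean = sum(samples) / runs
var = sum((x - mean) ** 2 for x in samples) / (runs - 1)
# both mean and var should be close to lam * t = 8
```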

Theorem (Conditioning on the number of arrivals) Given that in the interval $ (0,  T]$ the number of arrivals is $ N_T=n$, the $ n$ arrival times are independent and uniformly distributed on $ [0,  T]$.

Proof

Figure 4: Uniform bins
\begin{figure}\centering\epsfig{file=line.eps, width=2in}\end{figure}
Independence of the arrival times $ t_1$, $ t_2$, etc. follows directly from the independence of non-overlapping increments. In particular, let $ t_1$ and $ t_2$ be the arrival times of the first and second events; then

\begin{displaymath}\begin{split}&P(t_1\in(0,\,s],\,t_2\in(s,\,t])=P(N_s=1,\,N_t-N_s=1)=P(N_s=1)\,P(N_t-N_s=1)\\ &=P(N_s=1)\,P(N_t-N_s=1\vert N_s=1)=P(t_1\in(0,\,s])\,P(t_2\in(s,\,t]) \end{split}\end{displaymath}    

Suppose we know that exactly one event happened in the interval $ (0,\,T]$, and suppose the interval is partitioned into $ M$ segments of length $ \triangle t$, as shown in Fig. 4. Let $ p_i$ be the probability of the event happening in the $ i$-th bin; then $ \sum_{i=1}^{M}p_i=1$. From the definition of the Poisson process it follows that $ p_i\propto \lambda\, \triangle t$, say $ p_i=C(\lambda\, \triangle t+ o(\triangle t))$. The constant $ C$ is determined from

$\displaystyle \sum_{i=1}^{M}C(\lambda\, \triangle t + o(\triangle t))=1\;\Rightarrow\; C=\frac{1}{M\,\lambda\, \triangle t+M\, o(\triangle t)}=\frac{1} {T(\lambda+\frac{o(\triangle t)}{\triangle t})},$    

Let $ t_1$ be a random variable corresponding to the time of arrival, then the probability density function (pdf) of $ t_1$ can be defined as

$\displaystyle f_{t_1}(t)=\lim_{\triangle t\rightarrow 0}\frac{p_i}{\triangle t}= \frac{1}{T}\quad \forall i=1,2,\cdots,M$   where$\displaystyle \quad t=i \triangle t.$    

Therefore, $ t_1$ is uniformly distributed on $ [0,  T]$.
Let $ t_1$ and $ t_2$ be the arrival times of two events, and suppose we know that exactly two events happened on $ (0,\,T]$. Also assume that $ t_1$ and $ t_2$ are mere labels of the events, not necessarily ordered. Given that $ t_1$ happened in the $ j$-th bin, the probability of $ t_2$ occurring in any bin of size $ \triangle t$ is proportional to the size of that bin, i.e. $ p_i\propto \lambda\, \triangle t$, except for the $ j$-th bin, where $ p_j\propto o(\triangle t)$. As the bin size becomes infinitesimal, the probability $ p_i$ remains constant over all but one bin, the bin in which $ t_1$ occurred, where $ p_j=0$. But this is a set of measure zero, so the cumulative sum over $ p_i$ again gives rise to a uniform distribution on $ (0,\,T]$. $ \blacksquare$
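The conditional uniformity can be checked by simulation (a minimal sketch; the rate and window are arbitrary): over many sample paths, arrivals should land in $ (0,\,T/2]$ about half of the time.

```python
import random

random.seed(1)

lam, T = 10.0, 1.0                   # arbitrary rate and observation window
in_first_half = 0
total = 0
for _ in range(5000):
    # generate one sample path on (0, T] via exponential inter-arrival times
    clock, times = random.expovariate(lam), []
    while clock <= T:
        times.append(clock)
        clock += random.expovariate(lam)
    total += len(times)
    in_first_half += sum(1 for s in times if s <= T / 2)

frac = in_first_half / total         # by uniformity, about 1/2 of arrivals fall in (0, T/2]
```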

Question What is the probability of observing $ n$ events at instances $ \tau_1,$ $ \tau_2,$ $ \cdots,$ $ \tau_n$ on the interval $ [0,  T]$?

Since the arrival times $ t_1,\,t_2,\,\cdots,\,t_n$ are continuous random variables, the answer is 0. However, we can calculate the associated pdf as

\begin{displaymath}\begin{split}&f_{t_1\,t_2\,\cdots\,t_n}(\tau_1,\,\tau_2,\,\cdots,\,\tau_n)= f_{t_1\,\cdots\,t_n\vert N_T}(\tau_1,\,\cdots,\,\tau_n\vert n)\,P(N_T=n)\\ &=\frac{1}{T^n}\,\frac{(\lambda\,T)^n}{n!}\,e^{-\lambda\,T}= \frac{\lambda^n}{n!}\,e^{-\lambda\, T} \end{split}\end{displaymath}    

Question What is the power spectrum of the Poisson process?

It does not make sense to talk about the power spectrum of the Poisson process, since it is not a stationary process. In particular, the mean of the Poisson process is

$\displaystyle E[N_t]=\lambda t$    

and its autocorrelation function is

\begin{displaymath}\begin{split}R(t,s)&\triangleq E[N_t\,N_s]\\ R(t,s)&\overset{s\,\ge\, t}{=}E[N_t\,(N_s-N_t)]+E[N_t^2]=\lambda\,t\,\lambda\,(s-t)+\lambda^2\,t^2+\lambda\,t=\lambda^2\,t\,s+\lambda\,t \end{split}\end{displaymath}    

Since $ R(t,s)\neq R(t-s)$, we conclude that $ \{N_t,\; t\ge 0\}$ is not stationary (in the weak sense); therefore it does not make sense to talk about its power spectrum. Let us instead define the following stochastic process (Fig. 5)

$\displaystyle S_t=\frac{dN_t}{dt}=\sum_{i}\delta(t-t_i)\quad -$spike train (6)

Figure 5: Spike train
\begin{figure}\centering\epsfig{file=spike.eps, width=2in}\end{figure}
The fundamental lemma says that if $ Y(t)=L\{X(t)\}$, where $ L$ is a linear operator, then

$\displaystyle E[Y(t)]=L\{E[X(t)]\}$    

Since differentiation is a linear operator we have

$\displaystyle E[S_t]=\frac{d(\lambda t)}{dt}=\lambda$    

Also, it can be shown using the theory of linear operators that

\begin{displaymath}\begin{split}R_{SS}(t,s)&=\frac{\partial}{\partial t} \left[\frac{\partial R(t,s)}{\partial s}\right]=\frac{\partial}{\partial t} \left[\frac{\partial}{\partial s}\left(\lambda^2\,t\,s+\lambda\,\min(t,s)\right)\right]\\ &=\frac{\partial}{\partial t} \left[\lambda^2\,t+\lambda\,\underbrace{u(t-s)}_{\text{Heaviside function}}\right]=\lambda^2+\lambda\,\delta(t-s) \end{split}\end{displaymath}    

Thus, $ S_t$ is a WSS (wide-sense stationary) stochastic process, and it makes sense to define the power spectrum of such a process as the Fourier transform of its autocorrelation function, i.e.

$\displaystyle P_S(\omega)=\mathcal{F}\{R_{SS}(\tau)\}=\int_{-\infty}^{\infty}R_{SS}(\tau)\, e^{-j\, \omega\, \tau}\,d\tau=\lambda+ \lambda^2\, 2\,\pi\, \delta(\omega)$    

Therefore, the spike train $ S_t=\sum_{i}\delta(t-t_i)$ with independent inter-arrival times behaves almost like white noise, since its power spectrum is flat for all frequencies, except for the spike at $ \omega=0$. The process $ S_t$ defined by (6) is a simple version of what is known in the engineering literature as shot noise.
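The flat spectrum corresponds, through the delta-like autocorrelation derived above, to spike counts over disjoint bins being uncorrelated. A minimal simulation sketch (rate, bin width, and bin count are arbitrary choices) checks this whiteness in the time domain:

```python
import random

random.seed(2)

lam, dt, nbins = 30.0, 0.01, 20000   # arbitrary rate, bin width, number of bins

# bin the spike train: count arrivals in consecutive intervals of length dt
counts, clock, edge = [], random.expovariate(lam), 0.0
for _ in range(nbins):
    edge += dt
    c = 0
    while clock <= edge:
        c += 1
        clock += random.expovariate(lam)
    counts.append(c)

mean_c = sum(counts) / nbins                                   # ~ lam*dt = 0.3
var_c = sum((c - mean_c) ** 2 for c in counts) / (nbins - 1)   # ~ lam*dt (Poisson counts)
lag1 = sum((counts[i] - mean_c) * (counts[i + 1] - mean_c)
           for i in range(nbins - 1)) / (nbins - 1)            # ~ 0 (disjoint bins uncorrelated)
```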

Definition (Inhomogeneous Poisson process) A Poisson process with a non-constant rate $ \lambda=\lambda(t)$ is called an inhomogeneous Poisson process. In this case we have

  1. non-overlapping increments are independent (the stationarity is lost though).
  2. $ P(N_{t+\triangle t}-N_t=1)=\lambda(t) \triangle t+o(\triangle t)$
  3. $ P(N_{t+\triangle t}-N_t\ge2)=o(\triangle t)$

Theorem If $ \{N_t, t>0\}$ is a Poisson process with the rate $ \lambda(t)$, then $ N_t$ is a Poisson random variable with parameter $ \mu=\int_0^t \lambda(\xi) d\xi$ i.e.

$\displaystyle \boxed{P(N_t=k)=\frac{(\int_0^t \lambda(\xi)\,d\xi)^k}{k!} \,e^{-\int_0^t \lambda(\xi)\,d\xi}}$ (7)

Proof The proof of this theorem is identical to that of the homogeneous case, except that $ \lambda$ is replaced by $ \lambda(t)$. In particular, one can easily get

$\displaystyle \mathbf{G}_t(z)=e^{(z-1)\int_0^t \lambda(\xi) d\xi },$ (8)

from which (7) readily follows. $ \blacksquare$
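This result can be checked by simulation. One standard way to generate an inhomogeneous path is thinning (not covered in the text): simulate a homogeneous process at a rate $ \lambda_{\max}\ge\lambda(t)$ and keep an arrival at time $ u$ with probability $ \lambda(u)/\lambda_{\max}$. In the sketch below the rate function is hypothetical, and the ensemble mean of $ N_T$ should approach $ \mu=\int_0^T \lambda(\xi)\,d\xi$:

```python
import math
import random

random.seed(3)

def lam(t):
    """Hypothetical rate: lam(t) = 5 + 3*sin(2*pi*t), bounded above by LAM_MAX."""
    return 5.0 + 3.0 * math.sin(2.0 * math.pi * t)

LAM_MAX = 8.0

def inhomogeneous_path(T):
    """Thinning: rate-LAM_MAX arrivals, each kept at time u with prob lam(u)/LAM_MAX."""
    times, clock = [], random.expovariate(LAM_MAX)
    while clock <= T:
        if random.random() <= lam(clock) / LAM_MAX:
            times.append(clock)
        clock += random.expovariate(LAM_MAX)
    return times

T, runs = 1.0, 20000
counts = [len(inhomogeneous_path(T)) for _ in range(runs)]
mean = sum(counts) / runs    # should approach mu = int_0^1 lam(xi) dxi = 5
var = sum((c - mean) ** 2 for c in counts) / (runs - 1)   # Poisson: also ~ 5
```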

Theorem Let $ \{N_t, t>0\}$ be an inhomogeneous Poisson process with the rate $ \lambda(t)$ and let $ t>s\ge0$, then

$\displaystyle \boxed{P(N_t-N_s=k)=\frac{(\int_s^t \lambda(\xi)\,d\xi)^k}{k!} \,e^{-\int_s^t \lambda(\xi)\,d\xi}}$ (9)

The usefulness of this theorem stems from the fact that we can no longer use $ P(N_t-N_s=k)=P(N_{t-s}=k)$, since the increments are no longer stationary.

Proof

\begin{displaymath}\begin{split}&\mathbf{G}_t(z)=E[z^{N_t}]=E[z^{N_s}\,z^{N_t-N_s}]=\mathbf{G}_s(z)\,E[z^{N_t-N_s}]\\ &\Rightarrow\; E[z^{N_t-N_s}]=\frac{\mathbf{G}_t(z)}{\mathbf{G}_s(z)}=\frac{e^{(z-1)\int_0^t \lambda(\xi)\,d\xi }}{e^{(z-1)\int_0^s \lambda(\xi)\,d\xi }}= e^{(z-1)\int_s^t \lambda(\xi)\,d\xi } \end{split}\end{displaymath}    

Thus, $ N_t-N_s$ is a Poisson random variable with parameter $ \mu=\int_s^t \lambda(\xi) d\xi$, and (9) easily follows. $ \blacksquare$

Theorem

  1. $ E[N_t]=\int_0^t \lambda(\xi) d\xi$
  2. $ Var [N_t]=\int_0^t \lambda(\xi) d\xi$

Proof Recall that

$\displaystyle E[N_t]=\left[\frac{d\mathbf{G}_t(z)}{dz}\right]_{z=1}$   and$\displaystyle \quad E[N_t^2]=\left[\frac{d^2\mathbf{G}_t(z)}{dz^2}\right]_{z=1}+E[N_t]$    

From (8) we have $ \mathbf{G}_t(z)=e^{(z-1)\int_0^t \lambda(\xi) d\xi}$, and the two results follow after immediate calculations. $ \blacksquare$

Theorem (Conditioning on the number of arrivals) Given that in the interval $ (0,  T]$ the number of arrivals is $ N_T=n$, the $ n$ arrival times are independently distributed on $ [0,  T]$ with the pdf $ \lambda(t)/\int_0^T \lambda(\xi) d\xi$.

Proof The proof of this theorem is analogous to that of the homogeneous case. The probability of a single event happening in any of the $ M$ bins (Fig. 4) is given by $ p_i=C(\lambda(i\, \triangle t)\, \triangle t+ o(\triangle t))$, where $ i$ is the bin index. Given that exactly one event occurred in the interval $ (0,\,T]$, we have

\begin{displaymath}\begin{split}\sum_{i=1}^M p_i&=1\,\Rightarrow\, C=\frac{1}{\sum_{i=1}^M(\lambda(i\,\triangle t)\,\triangle t+o(\triangle t))}\\ f_{t_1}(t)&=\lim_{\triangle t\rightarrow 0}\frac{p_i}{\triangle t}=\frac{\lambda(t)}{\int_0^T\lambda(\xi)\,d\xi}\quad \text{where} \quad t=i\, \triangle t. \end{split}\end{displaymath}    

The argument for independence of two or more arrival times is identical to that of the homogeneous case. $ \blacksquare$

Question What is the probability of observing $ n$ events at instances $ \tau_1,$ $ \tau_2,$ $ \cdots,$ $ \tau_n$ on the interval $ [0,  T]$?

Since the arrival times $ t_1,\,t_2,\,\cdots,\,t_n$ are continuous random variables, the answer is 0. However, we can calculate the associated pdf as

\begin{displaymath}\begin{split}&f_{t_1\,t_2\,\cdots\,t_n}(\tau_1,\,\tau_2,\,\cdots,\,\tau_n)= f_{t_1\,\cdots\,t_n\vert N_T}(\tau_1,\,\cdots,\,\tau_n\vert n)\,P(N_T=n)\\ &=\prod_{i=1}^n\frac{\lambda(\tau_i)}{\int_0^T\lambda(\xi)\,d\xi}\;\frac{\left(\int_0^T\lambda(\xi)\,d\xi\right)^n}{n!}\,e^{-\int_0^T\lambda(\xi)\,d\xi}= \frac{\prod_{i=1}^n\lambda(\tau_i)}{n!} \,e^{-\int_0^T\lambda(\xi)\,d\xi} \end{split}\end{displaymath}    


Question How can we generate a sample path of a point (Poisson) process on $ [0,\,T]$? It can be done in many ways. For a homogeneous process we use three methods:

  1. Conditioning on the number of arrivals. Draw the number of arrivals $ n$ from a Poisson distribution with parameter $ \lambda\,T$, and then draw $ n$ numbers uniformly distributed on $ [0,  T]$.
  2. Method of infinitesimal increments. Partition the segment $ [0,\,T]$ into sufficiently small subsegments of length $ \triangle t$ (say $ \lambda\, \triangle t \ll 1$). For each subsegment draw a number $ s$ from a uniform distribution on $ [0,\,1]$. If $ s\le\lambda\,\triangle t\,e^{-\lambda\,\triangle t}$, an event occurs in that subsegment. We repeat the procedure for all subsegments. For a less expensive procedure, we can use the approximation $ s\le \lambda\,\triangle t$.
  3. Method of interarrival times. Using the fact that inter-arrival times are independent and exponentially distributed, we draw $ n$ random variables $ \{T_i\}_1^n$ from exponential distribution with parameter $ \lambda$, where $ n$ is the smallest integer that satisfies the criterion $ \sum_{i=1}^n T_i>T$. Then the sequence $ \{T_i\}_1^{n-1}$ generates the required point process.
For an inhomogeneous process, procedures 1 and 2 can be modified using the corresponding theory of the inhomogeneous Poisson process.
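The three procedures can be sketched in Python (a minimal implementation, not the exact code used for the figures below; method 2 uses the cheaper approximation $ s\le\lambda\,\triangle t$, and method 1 draws the Poisson count with Knuth's product-of-uniforms method, which is not described in the text):

```python
import math
import random

random.seed(4)

lam, T = 30.0, 1.0   # same values as the illustration: T = 1 s, lam = 30 events/s

def poisson_draw(mu):
    """Knuth's method: count uniforms until their product drops below e^{-mu}."""
    limit, k, p = math.exp(-mu), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def by_conditioning():
    """Method 1: draw n ~ Poisson(lam*T), then n arrival times uniform on [0, T]."""
    n = poisson_draw(lam * T)
    return sorted(random.uniform(0, T) for _ in range(n))

def by_increments(dt=0.001):
    """Method 2: Bernoulli trial with probability lam*dt in each small bin."""
    return [i * dt for i in range(int(T / dt)) if random.random() <= lam * dt]

def by_interarrivals():
    """Method 3: cumulative sums of i.i.d. exponential inter-arrival times."""
    times, clock = [], random.expovariate(lam)
    while clock <= T:
        times.append(clock)
        clock += random.expovariate(lam)
    return times

paths = [by_conditioning(), by_increments(), by_interarrivals()]
```

All three generators should produce ensemble-average counts near $ \lambda\,T=30$, matching the sample statistics reported below.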

The illustration of the three procedures for the homogeneous case is given below. We use $ T=1 \,s$ and $ \lambda=30\, events/s$. For each procedure we generate $ 1000$ sample paths and compute the statistics by averaging over the ensemble. Raster plots of $ 10$ out of $ 1000$ samples for the three methods are shown in Fig. 6, Fig. 7 and Fig. 8, respectively. Since the generated processes are homogeneous, the inter-arrival time distribution is exponential with parameter $ \lambda$. The histograms of inter-arrival times corresponding to the three methods are also given below; they clearly exhibit an exponential trend. The sample mean and sample standard deviation (unbiased) are also shown. Keep in mind that for exponentially distributed random variables, both the mean and the standard deviation are equal to $ 1/\lambda$. The sample statistics indicate that $ \lambda$ is in the vicinity of $ 30$.

Figure 6: Realization of a point process using conditioning on the number of arrivals. (Top) Ten different sample paths of the same point process shown as raster plots. (Bottom) The histogram of inter-arrival times, showing the exponential trend
\begin{figure}\centering\epsfig{file=rast_homo_1.eps, width=5in} [0.1in]
\epsfig{file=hist_homo_1.eps, width=5.70in}\end{figure}

Figure 7: Realization of a point process using method of infinitesimal increments. (Top) Ten different sample paths of the same point process shown as raster plots. (Bottom) The histogram of inter-arrival times, showing the exponential trend
\begin{figure}\centering\epsfig{file=rast_homo_2.eps, width=5in} [0.1in]
\epsfig{file=hist_homo_2.eps, width=5.70in}\end{figure}

Figure 8: Realization of a point process using method of independent inter-arrival times. (Top) Ten different sample paths of the same point process shown as raster plots. (Bottom) The histogram of inter-arrival times, showing the exponential trend
\begin{figure}\centering\epsfig{file=rast_homo_3.eps, width=5in} [0.1in]
\epsfig{file=hist_homo_3.eps, width=5.7in}\end{figure}


Zoran Nenadic 2002-12-13