Poisson Processes

Many scenarios are natural continuous-time discrete-state processes. Consider modeling the number of people waiting for the bus. People can arrive at the stop at any moment, but the number of people is certainly discrete. Sometimes a bus comes and picks everyone up.

We could discretize this process, in which case we must handle transitions for however many people arrive in each interval. We’d like to define how things interact as $\Delta t \to 0$.

This should sound a lot like a Poisson process, which can be viewed as a counting process $N(t)$ at a rate $\lambda$, where $N(t)$ counts the number of arrivals in $[0, t]$. In this case, the only relevant information is the times of the jumps $0 < T_1 < T_2 < \cdots$

We also look at the interarrival times $\tau_i = T_i - T_{i-1}$ (with $T_0 = 0$).

Given the above definitions, we can write some useful equivalences: $\{N(t) \ge n\} = \{T_n \le t\}$ and $\{N(t) = n\} = \{T_n \le t < T_{n+1}\}$.

A counting process can be defined by sampling the interarrival times from any distribution. However, it will only be a Markov process when we sample from the exponential distribution: for the Markov property to hold, the probability of a jump cannot depend on how long we’ve already waited within the current interval.

Definition: $(X_t)_{t \ge 0}$ is a Markov process if, for all $s, t \ge 0$, the distribution of $X_{t+s}$ given $(X_u)_{u \le t}$ only depends on $X_t$. In other words, the future depends on the present, not the past.

A process is time-homogeneous if $\mathbb{P}(X_{t+s} = j \mid X_t = i)$ does not depend on $t$.

Definition: A random variable $T$ is memoryless if $\mathbb{P}(T > t + s \mid T > t) = \mathbb{P}(T > s)$ for all $s, t \ge 0$.

Theorem: As we said above, the only distribution satisfying the memoryless property is the exponential distribution. This is a very strong condition.

Assume $T$ is memoryless and let $g(t) = \mathbb{P}(T > t)$.
We use the condition from above, which becomes $g(t + s) = g(t)\,g(s)$.

Differentiating in $s$ at $s = 0$ yields $g'(t) = g'(0)\,g(t)$.

Now it’s pretty clear that the only possible distributions satisfying this differential equation are exponential: $g(t) = e^{g'(0) t} = e^{-\lambda t}$ with $\lambda = -g'(0)$, i.e. $T \sim \mathrm{Exp}(\lambda)$.
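
As a quick sanity check (an illustration only, not part of the argument), we can verify the memoryless property of the exponential distribution numerically; the rate and the times $t$, $s$ below are arbitrary choices.

```python
import numpy as np

# Numerical check that Exp(lam) is memoryless: P(T > t+s | T > t) = P(T > s).
rng = np.random.default_rng(0)
lam, t, s = 2.0, 0.5, 0.3
samples = rng.exponential(scale=1 / lam, size=1_000_000)

conditional = np.mean(samples[samples > t] > t + s)  # P(T > t+s | T > t)
unconditional = np.mean(samples > s)                 # P(T > s)
print(conditional, unconditional, np.exp(-lam * s))  # all three should agree
```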

Definition: The Poisson process of rate $\lambda$ is the counting process $N(t)$ whose interarrival times are independent and exponential with parameter $\lambda$.
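
A minimal sketch of sampling from this definition, assuming the rate `lam` and horizon `T` below (both arbitrary illustration values): draw i.i.d. exponential interarrival times and accumulate them into jump times.

```python
import numpy as np

def poisson_jump_times(lam, T, rng):
    """Jump times T_1 < T_2 < ... <= T of a rate-lam Poisson process on [0, T]."""
    times = []
    t = rng.exponential(1 / lam)       # first interarrival time ~ Exp(lam)
    while t <= T:
        times.append(t)
        t += rng.exponential(1 / lam)  # next interarrival time ~ Exp(lam)
    return np.array(times)

rng = np.random.default_rng(1)
jumps = poisson_jump_times(lam=3.0, T=10.0, rng=rng)
print(len(jumps))  # N(T): a Poisson(lam * T) number of jumps, mean 30 here
```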

Definition: $N(t)$ has stationary increments if $N(t + s) - N(t)$ has the same distribution as $N(s) - N(0)$ for all $s, t \ge 0$.
This implies time-homogeneity.

Definition: $N(t)$ has independent increments if $N(t_2) - N(t_1)$ is independent of $N(t_4) - N(t_3)$ for all disjoint intervals $(t_1, t_2]$ and $(t_3, t_4]$.
This implies the Markov property.

Definition: A Lévy process is a process with independent, stationary increments. These are all Markov processes.

Theorem: The following statements are equivalent:

  1. $N(t)$ is a Poisson process of rate $\lambda$
  2. $N(t)$ has stationary, independent increments (Lévy process) and $N(t)$ follows the Poisson distribution with mean $\lambda t$
  3. $N(t)$ is a Lévy process, $\mathbb{P}(N(h) = 0) = 1 - \lambda h + o(h)$, $\mathbb{P}(N(h) = 1) = \lambda h + o(h)$, and $\mathbb{P}(N(h) \ge 2) = o(h)$. This says the probability of more than one jump approaches $0$ faster than $h$ for small increments.

Here, $o(h)$ means “negligible compared to $h$”: a function $f(h)$ such that $f(h)/h \to 0$ as $h \to 0$. We derive the probabilities listed by taking a look at the Taylor series of $e^{-\lambda h}$: for example, $\mathbb{P}(N(h) = 0) = e^{-\lambda h} = 1 - \lambda h + o(h)$ and $\mathbb{P}(N(h) = 1) = \lambda h\, e^{-\lambda h} = \lambda h + o(h)$.

Theorem: If $N_1(t)$ and $N_2(t)$ are independent Poisson processes with rates $\lambda_1$ and $\lambda_2$, then $N_1(t) + N_2(t)$ is a Poisson process with rate $\lambda_1 + \lambda_2$.

Theorem: One can “split” a Poisson process $N(t)$ of rate $\lambda$ into independent processes $N_1(t)$ with rate $p\lambda$ and $N_2(t)$ with rate $(1 - p)\lambda$ by assigning each jump to $N_1$ independently with probability $p$.
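
Both theorems can be checked empirically. The sketch below (rates, splitting probability, and horizon are arbitrary illustration values) merges two independent processes and thins a single process, then compares the observed jump rates.

```python
import numpy as np

rng = np.random.default_rng(2)
lam1, lam2, p, T = 2.0, 3.0, 0.4, 1000.0

# Superposition: counts of two independent Poisson processes on [0, T] add up.
n1, n2 = rng.poisson(lam1 * T), rng.poisson(lam2 * T)
print((n1 + n2) / T)  # observed rate of the merged process, close to lam1 + lam2 = 5

# Thinning: keep each jump of a rate-(lam1 + lam2) process with probability p.
n = rng.poisson((lam1 + lam2) * T)
kept = rng.random(n) < p
print(kept.sum() / T, (~kept).sum() / T)  # close to p * 5 and (1 - p) * 5
```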

Defining a Continuous Process

An important fact to keep in mind is that the minimum of independent exponential variables with rates $\lambda_1, \dots, \lambda_n$ is an exponential variable with rate $\lambda_1 + \cdots + \lambda_n$.

The probability of the $i$th variable “triggering” (being the minimum) is $\lambda_i / (\lambda_1 + \cdots + \lambda_n)$.
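
Both facts are easy to check by simulation; the rates below are arbitrary illustration values.

```python
import numpy as np

rng = np.random.default_rng(3)
rates = np.array([1.0, 2.0, 5.0])
samples = rng.exponential(1 / rates, size=(1_000_000, 3))  # column i ~ Exp(rates[i])

mins = samples.min(axis=1)
winners = samples.argmin(axis=1)
print(1 / mins.mean(), rates.sum())                              # rate of the minimum vs 8.0
print(np.bincount(winners) / len(winners), rates / rates.sum())  # "trigger" probabilities
```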

To recap, we need a process with:

  • Markov property: $\mathbb{P}(X_{t+s} = j \mid X_t = i, (X_u)_{u < t}) = \mathbb{P}(X_{t+s} = j \mid X_t = i)$
  • Time-homogeneity: $\mathbb{P}(X_{t+s} = j \mid X_t = i) = \mathbb{P}(X_s = j \mid X_0 = i) =: P_{ij}(s)$

We assign a rate $q_{ij} \ge 0$ for each transition $i \to j$ ($i \ne j$), denoting the rate of an exponential random variable governing that transition.

$q_i = \sum_{j \ne i} q_{ij}$ represents the total rate at which the chain transitions out of state $i$. By the minimum-of-exponentials fact above, the chain holds in state $i$ for an $\mathrm{Exp}(q_i)$ time and then jumps to $j$ with probability $q_{ij}/q_i$.
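
This description translates directly into a simulation: wait an exponential holding time with the total exit rate, then pick the destination in proportion to the individual rates. A minimal sketch, assuming the hypothetical three-state rate matrix `q` below:

```python
import numpy as np

# Off-diagonal entry q[i, j] is the rate of the i -> j clock (made-up example values).
q = np.array([[0.0, 1.0, 2.0],
              [0.5, 0.0, 1.5],
              [1.0, 1.0, 0.0]])

def simulate_ctmc(q, x0, T, rng):
    """Run the chain until time T; return the list of (jump time, new state) pairs."""
    t, x, path = 0.0, x0, [(0.0, x0)]
    while True:
        total_rate = q[x].sum()                      # q_x: total rate out of state x
        t += rng.exponential(1 / total_rate)         # holding time ~ Exp(q_x)
        if t > T:
            return path
        x = rng.choice(len(q), p=q[x] / total_rate)  # jump to j with probability q_xj / q_x
        path.append((t, x))

rng = np.random.default_rng(4)
print(simulate_ctmc(q, x0=0, T=5.0, rng=rng))
```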


Chapman-Kolmogorov: $P_{ij}(t + s) = \sum_k P_{ik}(t)\, P_{kj}(s)$, or in matrix form $P(t + s) = P(t)\,P(s)$.
We derive differential equations by taking the limits as $s \to 0$ in $P(t + s) = P(t)\,P(s)$ and in $P(t + s) = P(s)\,P(t)$.



We take the limit as $s$ approaches $0$ in $P(t + s) = P(t)\,P(s)$ to derive the forward equations $P_{ij}'(t) = \sum_{k \ne j} P_{ik}(t)\, q_{kj} - P_{ij}(t)\, q_j$ (this version is easier to understand)
We take the limit as $s$ approaches $0$ in $P(t + s) = P(s)\,P(t)$ to derive the backward equations $P_{ij}'(t) = \sum_{k \ne i} q_{ik}\, P_{kj}(t) - q_i\, P_{ij}(t)$ (this version is harder to understand but equivalent)

We may also write both compactly with the generator matrix $Q$ defined below: $P'(t) = P(t)\,Q$ (forward) and $P'(t) = Q\,P(t)$ (backward).

Our initial condition is $P(0) = I$.

In accordance with our equation, we define the infinitesimal generator $Q$ as the matrix whose $(i, j)$ entry is $q_{ij}$ if $i \ne j$ and $-q_i$ if $i = j$. Note that row sums equal $0$, nondiagonal entries are nonnegative, and diagonal entries are nonpositive.


Alternatively, let $P(t) = e^{tQ} = \sum_{n \ge 0} \frac{(tQ)^n}{n!}$,
the matrix exponential, which satisfies both Kolmogorov equations together with the initial condition $P(0) = I$.

Example: and

, ,

so
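
Numerically, $P(t) = e^{tQ}$ can be computed with a matrix exponential routine. A small sketch, assuming a hypothetical two-state generator with rates $q_{12} = 2$ and $q_{21} = 3$ (illustration values only):

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 2.0, 3.0
Q = np.array([[-lam, lam],
              [mu, -mu]])       # two-state generator: rows sum to 0

t = 0.7
P = expm(t * Q)                 # transition matrix P(t) = e^{tQ}
print(P)
print(P.sum(axis=1))            # each row sums to 1
print(expm(2 * t * Q) - P @ P)  # semigroup property P(2t) = P(t) P(t): ~ 0
```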

Note that the right eigenvector of $Q$ which is all $1$s has eigenvalue $0$ and corresponds to the stationary distribution, the left eigenvector in $\pi Q = 0$.

In other words, we are looking for a limiting probability $\pi_j = \lim_{t \to \infty} P_{ij}(t)$.

Suppose $Q$ is irreducible; then there is a unique probability vector $\pi$ satisfying $\pi Q = 0$, and all other eigenvalues of $Q$ have negative real part.
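
One way to find $\pi$ numerically is to take the left eigenvector of $Q$ for the eigenvalue closest to $0$ and normalize it. A sketch, reusing a made-up irreducible generator:

```python
import numpy as np

Q = np.array([[-3.0, 1.0, 2.0],
              [0.5, -2.0, 1.5],
              [1.0, 1.0, -2.0]])        # made-up irreducible generator

eigvals, left_vecs = np.linalg.eig(Q.T)  # eigenvectors of Q^T = left eigenvectors of Q
idx = np.argmin(np.abs(eigvals))         # eigenvalue closest to 0
pi = np.real(left_vecs[:, idx])
pi = pi / pi.sum()                       # normalize to a probability vector
print(pi, pi @ Q)                        # pi Q should be ~ 0
```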

If the chain is reducible, we analyze each communicating class separately.
Periodicity does not occur in continuous time.

We can analyze the chain similarly to a discrete-time chain. Compute the mean passage time to a fixed state $j$: $\tau_i = \mathbb{E}[\inf\{t \ge 0 : X_t = j\} \mid X_0 = i]$.

Let $T_1$ be the first time that the chain changes state; conditioning on where the chain jumps at $T_1$, $\tau_i = \frac{1}{q_i} + \sum_{k \ne i,\, k \ne j} \frac{q_{ik}}{q_i}\, \tau_k$ for $i \ne j$.
This simplifies to $q_i \tau_i - \sum_{k \ne i,\, k \ne j} q_{ik}\, \tau_k = 1$.

Say $\tilde{Q}$ is the matrix obtained by deleting the row and column associated with $j$; then we get $-\tilde{Q}\,\tau = \mathbf{1}$, so $\tau = -\tilde{Q}^{-1}\mathbf{1}$. Here $\tilde{Q}$ is an invertible matrix.
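
In code this is a single linear solve; the generator below is the same made-up example as above, and the target state is arbitrary.

```python
import numpy as np

Q = np.array([[-3.0, 1.0, 2.0],
              [0.5, -2.0, 1.5],
              [1.0, 1.0, -2.0]])
j = 2                                                # target state

keep = [i for i in range(len(Q)) if i != j]
Q_tilde = Q[np.ix_(keep, keep)]                      # delete row and column of j
tau = np.linalg.solve(-Q_tilde, np.ones(len(keep)))  # solve -Q_tilde tau = 1
print(dict(zip(keep, tau)))                          # expected hitting times of j
```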

We can also view the invariant measure in terms of the embedded Markov chain, which jumps from $i$ to $j$ with probability $q_{ij}/q_i$. If the embedded Markov chain has invariant measure $\tilde{\pi}$, then the associated continuous-time process has invariant measure $\pi_i \propto \tilde{\pi}_i / q_i$.
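
A short check of this relationship, again with a made-up generator: build the jump chain $\tilde{P}_{ij} = q_{ij}/q_i$, find its invariant measure, rescale by the mean holding times $1/q_i$, and confirm the result satisfies $\pi Q = 0$.

```python
import numpy as np

Q = np.array([[-3.0, 1.0, 2.0],
              [0.5, -2.0, 1.5],
              [1.0, 1.0, -2.0]])
q = -np.diag(Q)                                        # exit rates q_i
P_jump = Q / q[:, None]
np.fill_diagonal(P_jump, 0.0)                          # embedded chain: P_ij = q_ij / q_i

eigvals, vecs = np.linalg.eig(P_jump.T)
mu = np.real(vecs[:, np.argmin(np.abs(eigvals - 1))])  # invariant measure of the jump chain
pi = (mu / q) / (mu / q).sum()                         # rescale by holding times, normalize
print(pi, pi @ Q)                                      # pi Q should be ~ 0
```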

A continuous-time process is reversible with respect to $\pi$ if it satisfies detailed balance, $\pi_i\, q_{ij} = \pi_j\, q_{ji}$ for all $i \ne j$.