Blog - network theory (Fong guest posts)

This page contains material on stochastic Petri nets and chemical reaction networks written by Brendan Fong. It may go into one or more blog posts. To discuss this page as it is being written, go to the Azimuth Forum.


If you take your favourite stochastic Petri net, you probably now know how to construct two different equations from it: the rate equation and the master equation. In case you’ve forgotten, take a quick look at John’s review in Part 8.

The rate equation tells us how the expected number of things of each species changes with time. It looks like this:

$\frac{d x}{d t} = \sum_{\tau \in T} r(\tau) (n(\tau) - m(\tau)) x^{m(\tau)}$

The master equation tells us how the probability that we have a given number of things of each species changes with time. It looks like this:

$\frac{d}{d t} \Psi(t) = H \Psi(t), \qquad H = \sum_{\tau \in T} r(\tau) \left( {a^\dagger}^{n(\tau)} - {a^\dagger}^{m(\tau)} \right) a^{m(\tau)}$

So the rate equation describes the time evolution of the expected value of the state of the system, while the master equation describes the time evolution of a probability distribution over pure states. You might then guess that taking the ‘expected value’ of the master equation gives us the rate equation. I’d like to tell you this is true, but really it’s only approximately true: it holds exactly only if we are careful about what we mean by taking an expected value.

Indeed, expecting the rate equation to arise from naively taking expected values of the master equation is too much to ask in general. The rate equation only contains information about the evolution of the expected value, while the master equation tells us how the whole probability distribution evolves. It’s not hard to think of two distributions that have the same expected value, but evolve to have different expected values if we just let the master equation run.

For example, consider a single species $X$ with one transition $2X \to X$, with rate constant 1. The falling power in the master equation is then $n(n-1)$, so the expected value evolves according to $\frac{d}{d t} E(\Psi) = -\sum_n n(n-1) \psi_n$. A state concentrated at $n = 2$ has $\sum_n n(n-1)\psi_n = 2$, while an even mixture of $n = 0$ and $n = 4$ has $\sum_n n(n-1)\psi_n = 6$. Both states have expected value 2, but those expected values immediately start to decrease at different rates!
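If you like, you can watch this happen numerically. Here is a minimal Python sketch (a toy of my own, not part of the series): we evolve the master equation for the single transition $2X \to X$ with rate constant 1, truncated to states $n \le 4$, by simple Euler steps, starting from two distributions with the same mean.

```python
def step(psi, dt):
    """One Euler step of the master equation for 2X -> X (rate constant 1).

    The transition sends state n to n - 1, at rate n(n-1)."""
    n_max = len(psi) - 1
    new = list(psi)
    for n in range(n_max + 1):
        inflow = (n + 1) * n * psi[n + 1] if n + 1 <= n_max else 0.0
        outflow = n * (n - 1) * psi[n]
        new[n] += dt * (inflow - outflow)
    return new

def mean(psi):
    return sum(n * p for n, p in enumerate(psi))

# Two distributions, both with mean 2:
# A is concentrated at n = 2; B is an even mixture of n = 0 and n = 4.
A = [0.0, 0.0, 1.0, 0.0, 0.0]
B = [0.5, 0.0, 0.0, 0.0, 0.5]
for _ in range(100):          # evolve to t = 0.1
    A, B = step(A, 0.001), step(B, 0.001)
print(mean(A), mean(B))       # no longer equal: B's mean has dropped faster
```

The two means start equal, but B's mean falls faster, since B puts more probability on states where the falling power $n(n-1)$ is large.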

But, interestingly, it is exactly true if the state of the system takes the form of a product of Poisson distributions.

Let’s see how this goes. Remember that given a state $\Psi = \sum_{n \in \mathbb{N}^k} \psi_n z^n$, its expected value is $E(\Psi) = \sum_{n \in \mathbb{N}^k} n\psi_n \in [0,\infty)^k$. This means we take the sum of each pure state vector weighted by the probability our system is in that pure state. We’ll pretend for the moment that this sum always converges. Then

$\frac{d}{d t} E(\Psi) = \sum_{n \in \mathbb{N}^k} \sum_{\tau \in T} r(\tau) ((n+n(\tau)-m(\tau))-n)\frac{n!}{(n-m(\tau))!} \psi_n.$

To simplify this and make it look a bit more like the rate equation, we’ll write this as

$\frac{d}{d t} E(\Psi) = \sum_{\tau \in T} r(\tau) (n(\tau)-m(\tau)) \sum_{n \in \mathbb{N}^k}\frac{n!}{(n-m(\tau))!} \psi_n.$
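We can spot-check this formula numerically. Here is a little Python sketch (my own toy, assuming the single-species transition $2X \to X$ with rate constant 1, so $m(\tau) = 2$ and $n(\tau) = 1$): the time derivative of the mean should equal $(n(\tau) - m(\tau)) \sum_n n(n-1)\psi_n = -\sum_n n(n-1)\psi_n$.

```python
def mean(psi):
    return sum(n * p for n, p in enumerate(psi))

def falling2_moment(psi):
    # sum over n of n(n-1) psi_n, the second falling moment
    return sum(n * (n - 1) * p for n, p in enumerate(psi))

def step(psi, dt):
    """One Euler step of the master equation for 2X -> X (rate constant 1)."""
    n_max = len(psi) - 1
    new = list(psi)
    for n in range(n_max + 1):
        inflow = (n + 1) * n * psi[n + 1] if n + 1 <= n_max else 0.0
        new[n] += dt * (inflow - n * (n - 1) * psi[n])
    return new

psi = [0.2, 0.1, 0.3, 0.3, 0.1]   # an arbitrary distribution on n = 0..4
dt = 1e-6
numerical = (mean(step(psi, dt)) - mean(psi)) / dt
predicted = -falling2_moment(psi)
print(numerical, predicted)        # these agree
```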

Remember, the rate equation looks like this:

$\frac{d x}{d t} = \sum_{\tau \in T} r(\tau) (n(\tau) - m(\tau)) x^{m(\tau)}.$

So, remembering that we think of $x$ as the expected value $E(\Psi)$ of our mixed state, we can see that the rate equation and the master equation get along perfectly when

$\sum_{n \in \mathbb{N}^k}\frac{n!}{(n-m(\tau))!} \psi_n = (E(\Psi))^{m(\tau)}$

for each reaction $\tau$. This is not always true, but it *is* always true when $\Psi$ takes the form of the product of independent Poisson distributions with parameter $x = E(\Psi) \in [0, \infty)^k$. That is, when $\Psi$ has the form

$\Psi = e^{-x}\sum_{n \in \mathbb{N}^k} \frac{x^n}{n!}z^n.$

(Don’t forget we’re using an index-free notation here, so $e^{-x} = \prod_{i = 1}^k e^{-x_i}$ and $x^n = \prod_{i=1}^k x_i^{n_i}$!)

We can show this quite simply:

$\sum_{n \in \mathbb{N}^k}\frac{n!}{(n-m(\tau))!} \psi_n = \sum_{n \in \mathbb{N}^k}\frac{n!}{(n-m(\tau))!} \frac{x^n}{n!}e^{-x} = x^{m(\tau)}e^{-x}\sum_{n \in \mathbb{N}^k} \frac{x^{n-m(\tau)}}{(n-m(\tau))!} = x^{m(\tau)} = (E(\Psi))^{m(\tau)}.$
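If you'd like to see this identity numerically as well, here is a small Python check for a single species (truncating the Poisson sum at $n = 60$, which is plenty for small $x$; terms with $n < m(\tau)$ vanish because the falling power is zero):

```python
import math

def poisson(x, n):
    """Poisson probability psi_n = e^{-x} x^n / n!"""
    return math.exp(-x) * x**n / math.factorial(n)

def falling_moment(x, m, n_max=60):
    """Sum over n of n!/(n-m)! psi_n for a Poisson distribution with mean x."""
    return sum(math.factorial(n) // math.factorial(n - m) * poisson(x, n)
               for n in range(m, n_max + 1))

x, m = 2.3, 3
print(falling_moment(x, m), x**m)  # the m-th falling moment is exactly x^m
```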

From this we can see the Poisson distribution is special in the relationship between the master and rate equations: the rate equation tells us about the time evolution of a Poisson distribution. This means that if we know the (stochastic) state of our system takes the form of a product of Poisson distributions, then to understand how it changes over time, all we need to do is look at how its expected value changes according to the rate equation. The master equation gives no extra information here!

But our system is not always in a Poisson-form state, and I’ve promised that we nonetheless get the rate equation from taking an expected value of the master equation. Instead, the master equation becomes the rate equation when we talk in terms of concentrations and make the system arbitrarily large: that is, we duplicate our system many times over and average over the copies. When we make the system arbitrarily large, the stochastic nature of the Petri net becomes insignificant, and so we can talk in deterministic terms! This is essentially the law of large numbers.

We first need to think a bit about how the master equation arises.

Remember that each transition contributes a term to the master equation consisting of the product of a rate constant and a falling power. In particular, the falling power counts the number of ways our elements of each species can form the input complex for the transition. This assumes that our system is perfectly mixed.

But this isn’t true when our system gets really big! So we’re going to count things a bit differently. When our system gets big, things need to be near enough to each other for a transition to occur: a wolf can’t eat a rabbit if they are on opposite sides of the forest. So let’s say our system is divided into $N$ ‘zones’, and a transition can only take place among things in the same zone.

So in this case, our rate equation is actually talking about how the expected number of each species per ‘zone’ evolves.

With this understanding, let’s see what happens when we take our master equation, but duplicate our system over and over until we have many, many zones.

In the chemical reaction network literature, people call this the classical scaling.

For what’s to come, it’ll be good to write the master equation using number operators. For each species $i$, the **number operator** $N_i$ is the product of the creation and annihilation operators for that species:

$N_i = a_i^\dagger a_i$

Obviously then we have

$(a_i^\dagger a_i)^p = N_i^p$

for any natural number $p$. But more interestingly, we have

(1)${a_i^\dagger}^p a_i^p = {N_i}^{\underline{p}}$

where the underline here indicates a **falling power**:

${N_i}^{\underline{p}} = N_i (N_i - 1) (N_i - 2) \cdots (N_i - p + 1)$

**Puzzle 2**: Prove equation (1). If you get stuck, reread Part 6.
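This is no substitute for proving it yourself, but here is a numerical check of equation (1) in Python, representing a state by its list of coefficients of $z^0, \dots, z^{10}$ and using the conventions of this series: $a \, z^n = n z^{n-1}$ and $a^\dagger z^n = z^{n+1}$.

```python
N_MAX = 10  # truncate the basis at z^10

def annihilate(psi):
    # coefficient of z^n in a(psi) is (n+1) psi_{n+1}
    return [(n + 1) * psi[n + 1] if n + 1 <= N_MAX else 0
            for n in range(N_MAX + 1)]

def create(psi):
    # coefficient of z^n in a†(psi) is psi_{n-1}
    return [psi[n - 1] if n >= 1 else 0 for n in range(N_MAX + 1)]

def falling(n, p):
    # the falling power n(n-1)...(n-p+1)
    result = 1
    for i in range(p):
        result *= n - i
    return result

# Apply a†^p a^p to the basis vector z^n; compare with n(n-1)...(n-p+1) z^n.
for n in range(N_MAX - 3):
    for p in range(4):
        psi = [0] * (N_MAX + 1)
        psi[n] = 1
        for _ in range(p):
            psi = annihilate(psi)
        for _ in range(p):
            psi = create(psi)
        assert psi[n] == falling(n, p)
print("equation (1) checks out on this truncated basis")
```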

Given this, it makes sense to define

$N^{\underline n} = N_1^{\underline{n_1}} \cdots N_k^{\underline{n_k}}$

for any vector $n \in \mathbb{N}^k$, since then we have

${a^\dagger}^n a^n = N^{\underline n}$

because annihilation and creation operators for *different* species always commute.

This lets us write the master equation a different way. We have

$({a^\dagger}^{n(\tau)} - {a^\dagger}^{m(\tau)} ) \, a^{m(\tau)} =
{a^\dagger}^{n(\tau)} a^{m(\tau)} - N^{\underline{m(\tau)}}$

so the master equation is equivalent to

(2)$\frac{d}{d t} \Psi(t) = \sum_{\tau \in T} r(\tau) \, \left({a^\dagger}^{n(\tau)}a^{m(\tau)} -N^{\underline{m(\tau)}}\right) \, \Psi(t)$
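As a sanity check on this rewriting, here is a small Python sketch (again my own toy) comparing the two forms of the operator for a single species with $m(\tau) = 2$ and $n(\tau) = 1$, i.e. the transition $2X \to X$, on a truncated basis. Integer coefficients keep the comparison exact.

```python
N_MAX = 8  # truncate the basis at z^8

def annihilate(psi):
    # a z^n = n z^{n-1}
    return [(k + 1) * psi[k + 1] if k + 1 <= N_MAX else 0
            for k in range(N_MAX + 1)]

def create(psi):
    # a† z^n = z^{n+1}
    return [psi[k - 1] if k >= 1 else 0 for k in range(N_MAX + 1)]

def apply_lhs(psi, m=2, n=1):
    # (a†^n - a†^m) a^m
    for _ in range(m):
        psi = annihilate(psi)
    up_n, up_m = psi, psi
    for _ in range(n):
        up_n = create(up_n)
    for _ in range(m):
        up_m = create(up_m)
    return [a - b for a, b in zip(up_n, up_m)]

def apply_rhs(psi, m=2, n=1):
    # a†^n a^m - N^{falling m}
    down = psi
    for _ in range(m):
        down = annihilate(down)
    for _ in range(n):
        down = create(down)
    falling = [k * (k - 1) * psi[k] for k in range(N_MAX + 1)]  # N(N-1), m = 2
    return [a - b for a, b in zip(down, falling)]

psi = list(range(1, N_MAX + 2))          # an arbitrary integer test vector
print(apply_lhs(psi) == apply_rhs(psi))  # the two forms agree
```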

category: blog