This page contains material on stochastic Petri nets and chemical reaction networks written by Brendan Fong. It may go into one or more blog posts. To discuss this page as it is being written, go to the Azimuth Forum.
THIS IS A BIT OF A MESS. WILL BE NEATER SOON.
If you take your favourite stochastic Petri net, you probably now know how to construct two different equations from it: the rate equation and the master equation. In case you’ve forgotten, take a quick look at John’s review in Part 8.
The rate equation tells us how the expected number of things of each species changes with time, and looks like this:

$$\frac{d x}{d t} = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, x^{s(\tau)}$$

Here $x(t) \in \mathbb{R}^k$ is the vector whose components are the expected numbers of things of each of our $k$ species, $T$ is the set of transitions, and each transition $\tau$ has a rate constant $r(\tau)$, an input (source) vector $s(\tau) \in \mathbb{N}^k$ and an output (target) vector $t(\tau) \in \mathbb{N}^k$.
The master equation tells us how the probability that we have a given number of things of each species changes with time, and looks like this:

$$\frac{d}{d t} \Psi(t) = H \Psi(t)$$

Here $\Psi(t)$ is the stochastic state: the formal power series whose coefficient of $z^n$ is the probability $\psi_n$ of having exactly $n_i$ things of the $i$th species. The Hamiltonian $H$ is built from annihilation and creation operators:

$$H = \sum_{\tau \in T} r(\tau) \, \left( {a^\dagger}^{t(\tau)} - {a^\dagger}^{s(\tau)} \right) a^{s(\tau)}$$
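To fix ideas, here is what these two equations look like in a little made-up example (not from the original text): a single species $X$ with one transition that consumes two $X$s and produces one, $2X \to X$, with rate constant $r$, so $s(\tau) = 2$ and $t(\tau) = 1$. The rate equation is

$$\frac{d x}{d t} = r \, (1 - 2) \, x^2 = -r x^2,$$

while the master equation is

$$\frac{d}{d t} \Psi(t) = r \left( a^\dagger - {a^\dagger}^2 \right) a^2 \, \Psi(t),$$

which, in terms of the probabilities $\psi_n$ of having exactly $n$ things, says $\frac{d \psi_n}{d t} = r \left[ (n+1) n \, \psi_{n+1} - n (n-1) \, \psi_n \right]$.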
So the rate equation describes the time evolution of an average value of the state of the system, while the master equation describes the time evolution of a probability distribution over pure states. You might guess that by taking the 'expected value' of the master equation we get the rate equation. I'd like to tell you this is true, but really it's only approximately true: it's only true if we're careful about what we mean by taking the expected value.
Indeed, expecting the rate equation to pop out if we just naively take expected values in the master equation is too much to ask in general. The rate equation only contains information about the evolution of the expected value, while the master equation tells us how the whole probability distribution evolves. It's not hard to think of two distributions that have the same expected value but evolve to have different expected values if we just let the master equation run.
EXAMPLE
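Here is one example of this phenomenon, using the made-up transition $2X \to X$ from above. Compare two states with expected value 1: the pure state $\Psi_1 = z$, where we have exactly one thing, and the mixture $\Psi_2 = \tfrac{1}{2} + \tfrac{1}{2} z^2$, where we have equal chances of zero or two things. The transition needs two things as input, so its rate involves the falling power $n(n-1)$, which vanishes in the first state but not in the second. So $\Psi_1$ does not change at all under the master equation, while $\Psi_2$ immediately starts losing things: two states with the same expected value evolve to have different expected values.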
But, interestingly, it is exactly true if the state of the system takes the form of a product of Poisson distributions.
Let's see how this goes. Remember that given a state

$$\Psi = \sum_{n \in \mathbb{N}^k} \psi_n z^n,$$

its expected value is

$$\langle \Psi \rangle = \sum_{n \in \mathbb{N}^k} \psi_n \, n.$$

This means we take the sum of each pure state vector $n$, weighted by the probability $\psi_n$ that our system is in that pure state. We'll pretend for the moment that this sum always converges. Then, differentiating and using the master equation,

$$\frac{d}{d t} \langle \Psi(t) \rangle = \sum_{n \in \mathbb{N}^k} \frac{d \psi_n}{d t} \, n = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \sum_{n \in \mathbb{N}^k} \underline{n}^{s(\tau)} \psi_n,$$

where $\underline{n}^{s(\tau)} = \prod_{i=1}^k n_i (n_i - 1) \cdots (n_i - s_i(\tau) + 1)$ is the falling power that shows up in the master equation.

To simplify this and make it look a bit more like the rate equation, we'll write this as

$$\frac{d}{d t} \langle \Psi(t) \rangle = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, \langle \underline{n}^{s(\tau)} \rangle$$

where $\langle \underline{n}^{s(\tau)} \rangle = \sum_{n \in \mathbb{N}^k} \underline{n}^{s(\tau)} \psi_n$ is the expected value of this falling power.
Remember, the rate equation looks like this:

$$\frac{d x}{d t} = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, x^{s(\tau)}$$
So, remembering that we think of $x$ as the expected value $\langle \Psi \rangle$ of our mixed state, we can see that the rate equation and the master equation get along perfectly when

$$\langle \underline{n}^{s(\tau)} \rangle = \langle \Psi \rangle^{s(\tau)}$$

for each reaction $\tau$. This is not always true, but it is always true when $\Psi$ takes the form of a product of independent Poisson distributions with parameters $x_i$. That is, when $\Psi$ has the form

$$\Psi = e^{-(x_1 + \cdots + x_k)} \sum_{n \in \mathbb{N}^k} \frac{x^n}{n!} \, z^n.$$

(Don't forget we're using an index-free notation here, so $x^n = x_1^{n_1} \cdots x_k^{n_k}$ and $n! = n_1! \cdots n_k!$)

We can show this quite simply. The expected value of this state is $\langle \Psi \rangle = x$, since a Poisson distribution with parameter $x_i$ has mean $x_i$, and

$$\langle \underline{n}^{s(\tau)} \rangle = e^{-(x_1 + \cdots + x_k)} \sum_{n \in \mathbb{N}^k} \underline{n}^{s(\tau)} \, \frac{x^n}{n!} = e^{-(x_1 + \cdots + x_k)} \sum_{n \ge s(\tau)} \frac{x^n}{(n - s(\tau))!} = x^{s(\tau)} \, e^{-(x_1 + \cdots + x_k)} \sum_{m \in \mathbb{N}^k} \frac{x^m}{m!} = x^{s(\tau)} = \langle \Psi \rangle^{s(\tau)}.$$
From this we can see that Poisson distributions are special in the relationship between the master and rate equations: whenever the (stochastic) state of our system is a product of Poisson distributions, its expected value changes, at that instant, exactly as the rate equation says. So as long as the state keeps this Poisson form, to understand how it changes over time all we need to do is track its expected value with the rate equation, and the master equation gives no extra information.
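As a quick numerical sanity check on the computation above (my own sketch, not part of the original text, with made-up function names), the following few lines of Python verify that the factorial moments of a Poisson distribution are powers of its mean:

import math

def poisson_pmf(mean, n):
    """Probability of seeing exactly n things under a Poisson distribution."""
    return math.exp(-mean) * mean**n / math.factorial(n)

def falling_power(n, k):
    """The falling power n(n-1)...(n-k+1)."""
    result = 1
    for i in range(k):
        result *= n - i
    return result

mean, k = 3.7, 4
# Expected falling power; terms beyond n = 60 are negligible for this mean.
expected = sum(poisson_pmf(mean, n) * falling_power(n, k) for n in range(60))
print(expected, mean**k)   # both are about 187.4161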
But our system is not always in a Poisson-form state, and I've promised that we nonetheless get the rate equation by taking an expected value of the master equation. The resolution is that the master equation becomes the rate equation when we talk in terms of concentrations and make the system arbitrarily large, i.e. when we duplicate our system many times over and average over the copies. When we make the system arbitrarily large, the stochastic nature of the Petri net becomes insignificant, and we can talk in deterministic terms. This is essentially the law of large numbers.
We first need to think a bit about how the master equation arises.
Remember that each transition contributes a term to the master equation consisting of the product of a rate constant and a falling power. In particular, the falling power counts the number of ways the things of each species can form the input complex for the transition: for instance, a transition that needs two rabbits as input can occur in $n(n-1)$ ways when there are $n$ rabbits. This assumes that our system is perfectly mixed.
But this isn't true when our system gets really big! So we're going to count things a bit differently. When our system gets big, the things involved need to be near enough to each other for the transition to occur: a wolf can't eat a rabbit if they are on opposite sides of the forest. So let's divide our system into 'zones', regions small enough that within each zone everything is well mixed, and only things in the same zone can react with each other. (Maybe do this within an example.)
So in this case, our rate equation is actually talking about how the expected number of each species per ‘zone’ evolves.
With this understanding, let’s see what happens when we take our master equation, but duplicate our system over and over until we have many, many zones.
In the chemical reaction network literature, people call this the classical scaling.
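Before going through this carefully, here is a rough numerical illustration of the idea (again my own sketch, not from the original text; the transitions and all names are made up). It simulates the master-equation dynamics for the pair of transitions $X \to 2X$ (rate constant $c$) and $2X \to X$ (rate constant $r$), with the rate of the two-input transition scaled by the number of zones, and watches the concentration, the number of things per zone, cluster around the rate equation's equilibrium $x = c/r$ as the number of zones grows:

import random

def simulate(zones, c=1.0, r=1.0, x0=0.5, t_max=5.0):
    """Gillespie-style simulation of X -> 2X (rate c) and 2X -> X (rate r), scaled by the number of zones."""
    n = int(x0 * zones)                   # start with concentration x0 things per zone
    t = 0.0
    while t < t_max:
        birth = c * n                     # X -> 2X: each thing splits at rate c
        death = (r / zones) * n * (n - 1) # 2X -> X: a pair must meet in the same zone
        total = birth + death
        if total == 0:
            break                         # nothing left, nothing can happen
        t += random.expovariate(total)    # waiting time to the next event
        if random.random() * total < birth:
            n += 1
        else:
            n -= 1
    return n / zones                      # final concentration

random.seed(1)
# The rate equation dx/dt = c x - r x^2 predicts the concentration tends to c/r = 1.
for zones in (10, 100, 1000, 10000):
    runs = [simulate(zones) for _ in range(10)]
    print(zones, [round(x, 2) for x in runs])

As the number of zones grows, the runs cluster more and more tightly around the rate equation's prediction, which is the law of large numbers at work.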
For what's to come, it'll be good to write the master equation using number operators. For each species $i$, the number operator $N_i$ is the product of the creation and annihilation operators for that species:

$$N_i = a_i^\dagger a_i$$
Obviously then we have

$$N_i z^n = n_i z^n,$$

and so

$$N_i^m z^n = n_i^m z^n$$

for any natural number $m$. But more interestingly, we have

$$\underline{N_i}^m = {a_i^\dagger}^m a_i^m \qquad (1)$$

where the underline here indicates a falling power:

$$\underline{N_i}^m = N_i (N_i - 1)(N_i - 2) \cdots (N_i - m + 1)$$
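To see what equation (1) says in the simplest interesting case, take $m = 2$: it claims ${a_i^\dagger}^2 a_i^2 = N_i (N_i - 1)$, and indeed both sides send $z^n$ to $n_i (n_i - 1) z^n$.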
Puzzle 2: Prove equation (1). If you get stuck, reread Part 6.
Given this, it makes sense to define

$$\underline{N}^{s} = \underline{N_1}^{s_1} \cdots \underline{N_k}^{s_k}$$

for any vector $s \in \mathbb{N}^k$ of natural numbers, since then we have

$$\underline{N}^{s} = {a^\dagger}^{s} a^{s}$$

because annihilation and creation operators for different species always commute.
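If you like checking operator identities numerically, here is a small sketch of my own (not from the original text) that represents the creation and annihilation operators for a single species as matrices on states with at most a fixed number of things, and confirms that ${a^\dagger}^m a^m$ agrees with the falling power of the number operator:

import numpy as np

n_max = 12                      # truncate to states with at most n_max things
dim = n_max + 1

# In the z^n basis: a z^n = n z^(n-1), a† z^n = z^(n+1).
a = np.zeros((dim, dim))
a_dag = np.zeros((dim, dim))
for n in range(1, dim):
    a[n - 1, n] = n
for n in range(dim - 1):
    a_dag[n + 1, n] = 1

N = a_dag @ a                   # the number operator
m = 3
lhs = np.linalg.matrix_power(a_dag, m) @ np.linalg.matrix_power(a, m)
rhs = N @ (N - np.eye(dim)) @ (N - 2 * np.eye(dim))   # the falling power N(N-1)(N-2)

# The truncation does no harm here, since a^m lowers the number of things
# before a†^m raises it back up.
print(np.allclose(lhs, rhs))    # True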
This lets us write the master equation a different way. We have

$$H = \sum_{\tau \in T} r(\tau) \, \left( {a^\dagger}^{t(\tau)} - {a^\dagger}^{s(\tau)} \right) a^{s(\tau)} = \sum_{\tau \in T} r(\tau) \, \left( {a^\dagger}^{t(\tau)} a^{s(\tau)} - \underline{N}^{s(\tau)} \right)$$

so the master equation is equivalent to

$$\frac{d}{d t} \Psi(t) = \sum_{\tau \in T} r(\tau) \, \left( {a^\dagger}^{t(\tau)} a^{s(\tau)} - \underline{N}^{s(\tau)} \right) \Psi(t)$$
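For the made-up transition $2X \to X$ from earlier, for instance, this rewriting says

$$H = r \left( a^\dagger a^2 - \underline{N}^2 \right) = r \left( a^\dagger a^2 - N(N - 1) \right),$$

so the master equation becomes $\frac{d}{d t} \Psi(t) = r \left( a^\dagger a^2 - N(N-1) \right) \Psi(t)$: the first term describes the transition actually happening, while the second removes probability from each state at the rate the transition fires there, keeping total probability conserved.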