This page contains material on stochastic Petri nets and chemical reaction networks written by Brendan Fong. It may go into one or more blog posts. To discuss this page as it is being written, go to the Azimuth Forum.
The master equation describes the time evolution of a probability distribution of pure states, while the rate equation describes the time evolution of an average value of the state of the system. You might think that this means that by taking the ‘expected value’ of the master equation, we get the rate equation. As it turns out, that’s only approximately true. But it’s exactly true if the state of the system takes the form of a product of Poisson distributions!
Let’s see how this goes. Remember that given a state $\Psi = \sum_{n \in \mathbb{N}^k} \psi_n z^n$, its expected value is
$$ \langle \Psi \rangle = \sum_{n \in \mathbb{N}^k} \psi_n \, n . $$
This means we take the sum of each pure state vector $n$ weighted by the probability $\psi_n$ our system is in that pure state. We’ll pretend for the moment that this sum always converges. Then the master equation gives
$$ \frac{d}{d t} \langle \Psi(t) \rangle = \sum_{n} n \, \frac{d}{d t} \psi_n(t) . $$
To simplify this and make it look a bit more like the rate equation, we’ll write this as
$$ \frac{d}{d t} \langle \Psi(t) \rangle = \sum_{\tau \in T} r(\tau) \, \big( n(\tau) - m(\tau) \big) \sum_{n} n^{\underline{m(\tau)}} \, \psi_n $$
where the underline denotes a falling power, as explained below. Then, remembering that we think of $\langle \Psi \rangle$ as the expected value of our mixed state, we can see that the rate equation and the master equation get along perfectly when
$$ \sum_{n} n^{\underline{m(\tau)}} \, \psi_n = \langle \Psi \rangle^{m(\tau)} $$
for each reaction $\tau \in T$. This is not always true, but it is always true when $\Psi$ takes the form of the product of independent Poisson distributions with parameter $x_i$ for the $i$th species. That is, when $\Psi$ has the form
$$ \Psi = e^{-\sum_i x_i} \sum_{n \in \mathbb{N}^k} \frac{x^n}{n!} \, z^n . $$
(Don’t forget we’re using an index-free notation here, so $x^n = x_1^{n_1} \cdots x_k^{n_k}$ and $n! = n_1! \cdots n_k!$)
We can show this quite simply:
$$ \sum_{n} n^{\underline{m}} \, \psi_n = e^{-\sum_i x_i} \sum_{n \geq m} \frac{x^n}{(n - m)!} = x^m \, e^{-\sum_i x_i} \sum_{n} \frac{x^n}{n!} = x^m = \langle \Psi \rangle^{m} , $$
since $\langle \Psi \rangle = x$ for a product of Poisson distributions.
From this we can see that the Poisson distribution plays a special role in the relationship between the master and rate equations.
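As a quick sanity check, here is a minimal numerical sketch (the helper names are my own, not from the text) verifying that the $p$th falling moment of a Poisson distribution with parameter $x$ equals $x^p$, using a truncated sum:

```python
import math

def falling_power(n, p):
    """Falling power: n (n - 1) ... (n - p + 1)."""
    result = 1
    for i in range(p):
        result *= n - i
    return result

def expected_falling_power(x, p, cutoff=100):
    """Truncated pth falling moment of a Poisson distribution with parameter x."""
    total = 0.0
    prob = math.exp(-x)          # P(n = 0)
    for n in range(cutoff):
        total += falling_power(n, p) * prob
        prob *= x / (n + 1)      # recurrence: P(n + 1) = P(n) * x / (n + 1)
    return total

# The pth falling moment of a Poisson distribution equals x^p:
for x in [0.5, 2.0, 3.7]:
    for p in [1, 2, 3]:
        assert abs(expected_falling_power(x, p) - x**p) < 1e-9
```

For a product of independent Poisson distributions the same identity holds species by species, which is why the expected falling power factors into a power of the expected value.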
For what’s to come, it’ll be good to write the master equation using number operators. For each species $i$, the number operator $N_i$ is the product of the creation and annihilation operators for that species:
$$ N_i = a_i^\dagger a_i . $$
Obviously then we have
$$ N_i^p \, z^n = n_i^p \, z^n $$
for any natural number $p$. But more interestingly, we have
$$ a_i^{\dagger \, p} a_i^p \, z^n = n_i^{\underline{p}} \, z^n \qquad (1) $$
where the underline here indicates a falling power:
$$ n_i^{\underline{p}} = n_i (n_i - 1) \cdots (n_i - p + 1) . $$
Puzzle 2: Prove equation (1). If you get stuck, reread Part 6.
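If you would like to check equation (1) numerically before proving it, here is a sketch for a single species: represent $a$ and $a^\dagger$ as matrices on the span of $z^0, \dots, z^D$ (a truncation assumed only for illustration), and confirm that $a^{\dagger\,p} a^p$ acts diagonally with falling-power eigenvalues:

```python
import numpy as np

D = 12                              # truncation: work on span{z^0, ..., z^D}
A = np.zeros((D + 1, D + 1))        # annihilation: a z^n = n z^(n-1)
for n in range(1, D + 1):
    A[n - 1, n] = n
C = np.zeros((D + 1, D + 1))        # creation: a† z^n = z^(n+1)
for n in range(D):
    C[n + 1, n] = 1

def falling(n, p):
    """Falling power: n (n - 1) ... (n - p + 1)."""
    out = 1
    for i in range(p):
        out *= n - i
    return out

# Equation (1): a†^p a^p acts on z^n with eigenvalue the falling power of n:
p = 3
M = np.linalg.matrix_power(C, p) @ np.linalg.matrix_power(A, p)
for n in range(D + 1):
    e_n = np.zeros(D + 1)
    e_n[n] = 1.0
    assert np.allclose(M @ e_n, falling(n, p) * e_n)
```

Note that for $n < p$ the falling power contains a factor of zero, matching the fact that $a^p$ annihilates $z^n$ in that case.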
Given this, it makes sense to define
$$ N^{\underline{m}} = a^{\dagger \, m} a^{m} $$
for any vector $m \in \mathbb{N}^k$, since then we have
$$ N^{\underline{m}} \, z^n = n^{\underline{m}} \, z^n , \qquad n^{\underline{m}} = n_1^{\underline{m_1}} \cdots n_k^{\underline{m_k}} , $$
because annihilation and creation operators for different species always commute.
This lets us write the master equation a different way. We have
$$ a^{\dagger \, m(\tau)} a^{m(\tau)} = N^{\underline{m(\tau)}} , $$
so the master equation
$$ \frac{d}{d t} \Psi(t) = \sum_{\tau \in T} r(\tau) \left( a^{\dagger \, n(\tau)} - a^{\dagger \, m(\tau)} \right) a^{m(\tau)} \, \Psi(t) $$
is equivalent to
$$ \frac{d}{d t} \Psi(t) = \sum_{\tau \in T} r(\tau) \left( a^{\dagger \, n(\tau)} a^{m(\tau)} - N^{\underline{m(\tau)}} \right) \Psi(t) . $$
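Here is a small numerical sketch of this rewriting, for a hypothetical one-species Petri net with a fission transition $X \to 2X$ (rate $\alpha$) and a death transition $X \to \emptyset$ (rate $\beta$), truncated at degree $D$. Both forms of the Hamiltonian give the same matrix, and away from the truncation boundary its columns sum to zero, as an infinitesimal stochastic operator should:

```python
import numpy as np

D = 10                                   # truncate at z^D
A = np.zeros((D + 1, D + 1))             # annihilation: a z^n = n z^(n-1)
for n in range(1, D + 1):
    A[n - 1, n] = n
C = np.zeros((D + 1, D + 1))             # creation: a† z^n = z^(n+1)
for n in range(D):
    C[n + 1, n] = 1

alpha, beta = 0.7, 1.1                   # rates for X -> 2X and X -> nothing
I = np.eye(D + 1)

# Original form: sum over transitions of r(τ)(a†^n(τ) - a†^m(τ)) a^m(τ)
H1 = alpha * (C @ C - C) @ A + beta * (I - C) @ A
# Rewritten form: r(τ)(a†^n(τ) a^m(τ) - falling number operator)
N1 = C @ A                               # a† a, the m(τ) = 1 falling power
H2 = alpha * (C @ C @ A - N1) + beta * (A - N1)

assert np.allclose(H1, H2)               # the two forms agree

# Away from the truncation boundary, every column of H sums to zero:
col_sums = np.ones(D + 1) @ H1
assert np.allclose(col_sums[:D - 1], 0)
```

The two highest-degree columns leak probability out of the truncated space, which is an artifact of the cutoff rather than of the Hamiltonian itself.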
Motivation: $H$ is a Hamiltonian for a (finite) stochastic Petri net.
Observable: $O$, a diagonal operator.
Write $\int \Phi = \sum_{i \in X} \Phi_i$ for the sum of the entries of a vector $\Phi$.
Let $X$ be a countable set, and let $H$ be an operator with ‘only finitely many nonzero entries in each column’ and such that $\exp(t H)$ is stochastic for all $t \geq 0$ ($H$ is infinitesimal stochastic: $H_{i j} \geq 0$ for all $i \neq j$, and each column of $H$ sums to zero) – that is, $H$ is a Hamiltonian for a finite stochastic Petri net.
Let $O$ be a ‘diagonal’ operator.
Then
$$ [O, H] = 0 $$
if and only if
$$ \frac{d}{d t} \int O \Psi(t) = 0 \quad \text{and} \quad \frac{d}{d t} \int O^2 \Psi(t) = 0 $$
for all evolutions of states $\Psi(t)$ obeying the master equation $\frac{d}{d t} \Psi(t) = H \Psi(t)$.
We first prove that if $[O, H] = 0$ then $\frac{d}{d t} \int O \Psi(t) = 0$ and $\frac{d}{d t} \int O^2 \Psi(t) = 0$. In fact this is true for all powers $O^k$ of our observable $O$. As our integral and $O^k$ do not depend on $t$, we may pass the differentiation operator through them to see
$$ \frac{d}{d t} \int O^k \Psi(t) = \int O^k \frac{d}{d t} \Psi(t) . $$
Now since $\frac{d}{d t} \Psi(t) = H \Psi(t)$,
$$ \int O^k \frac{d}{d t} \Psi(t) = \int O^k H \Psi(t) . $$
But by our hypothesis, $O$ and $H$ commute! So
$$ \int O^k H \Psi(t) = \int H O^k \Psi(t) . $$
Since $\int H \chi_i = 0$ for all characteristic functions $\chi_i$ (the columns of $H$ each sum to zero), and since the integral is linear, $\int H \Phi = 0$ for all $\Phi$, and we thus arrive at the desired result:
$$ \frac{d}{d t} \int O^k \Psi(t) = 0 . $$
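To make the forward direction concrete, here is a sketch on a hypothetical three-state Markov process whose Hamiltonian commutes with a diagonal observable. The row vector $\mathbf{1}^\top O^k H$ vanishes, so the rate of change of $\int O^k \Psi(t)$ is zero for every state $\Psi$:

```python
import numpy as np

# A 3-state Markov process: states 0 and 1 exchange, state 2 is isolated.
# Columns of H sum to zero (infinitesimal stochastic).
H = np.array([[-1.0,  2.0, 0.0],
              [ 1.0, -2.0, 0.0],
              [ 0.0,  0.0, 0.0]])
# A diagonal observable constant on the communicating states {0, 1}:
O = np.diag([1.0, 1.0, 2.0])

assert np.allclose(O @ H - H @ O, 0)        # [O, H] = 0

ones = np.ones(3)                           # the integral: sum of entries
for k in [1, 2, 3]:
    # d/dt ∫ O^k Ψ(t) = (1ᵀ O^k H) Ψ(t) = 0 for every Ψ:
    assert np.allclose(ones @ np.linalg.matrix_power(O, k) @ H, 0)
```

The observable is constant on each communicating class of states, which is exactly the condition $(O_i - O_j) H_{i j} = 0$ derived below.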
The backward direction is a bit trickier. We now assume that the rate of change of the integral of our observable and its square are zero, and wish to show that this implies that our observable and Hamiltonian commute. More explicitly, we observe that our hypotheses are that
$$ \frac{d}{d t} \int O \Psi(t) = \int O H \Psi(t) = 0 $$
and
$$ \frac{d}{d t} \int O^2 \Psi(t) = \int O^2 H \Psi(t) = 0 $$
for all states $\Psi(t)$.
For elements $i$, $j$ of $X$, we write $H_{i j}$ for the matrix entries of $H$. Thinking of $i$ and $j$ as states of our system, and $H$ as a Hamiltonian, $H_{i j}$ represents the rate at which the state $j$ contributes to the state $i$. For our observable $O$, we further abbreviate each diagonal entry $O_{i i}$ as $O_i$, since for $i \neq j$ the entries $O_{i j}$ are zero anyway.
Observe that
$$ [O, H]_{i j} = O_i H_{i j} - H_{i j} O_j = (O_i - O_j) H_{i j} . $$
Since we want to show that this is zero for each pair of elements $i$, $j$ of $X$, it suffices to show that when $H_{i j} \neq 0$, then $O_i = O_j$. That is, we need to show that if the system can move from state $j$ to state $i$, then the observable takes the same value on these two states.
Fix the element $j$, and consider the sum
$$ \sum_{i \in X} (O_i - O_j)^2 H_{i j} . $$
Since $j$ is fixed, only finitely many of the $H_{i j}$ are nonzero, and so this sum always exists. Expanding this, we have
$$ \sum_{i} \left( O_i^2 - 2 O_i O_j + O_j^2 \right) H_{i j} . $$
But this sum splits as
$$ O_j^2 \sum_{i} H_{i j} \; - \; 2 O_j \sum_{i} O_i H_{i j} \; + \; \sum_{i} O_i^2 H_{i j} , $$
and these three terms are each zero: the first because $H$ is infinitesimal stochastic, and the latter two by our hypotheses. Thus we have that
$$ \sum_{i \in X} (O_i - O_j)^2 H_{i j} = 0 . $$
Now when $i = j$, $(O_i - O_j)^2 = 0$, and thus $(O_i - O_j)^2 H_{i j} = 0$. But when $i \neq j$, $(O_i - O_j)^2$ and $H_{i j}$ are non-negative. Since they sum to zero, they must each be individually zero. Thus for all $i \neq j$, we have
$$ (O_i - O_j)^2 H_{i j} = 0 . $$
This shows that either $H_{i j} = 0$ or $O_i = O_j$, which is what we wanted to show.
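It is worth seeing why conserving the mean alone is not enough, which is why the theorem asks for the second moment too. Here is a sketch with a hypothetical three-state walk that jumps from state $1$ to states $0$ and $2$ at equal rates: the first moment of the ‘position’ observable is conserved, but its second moment is not, and accordingly the observable fails to commute with $H$:

```python
import numpy as np

# A walk that jumps from state 1 to states 0 and 2 at equal rates.
# Columns sum to zero, so H is infinitesimal stochastic.
H = np.array([[0.0,  1.0, 0.0],
              [0.0, -2.0, 0.0],
              [0.0,  1.0, 0.0]])
O = np.diag([0.0, 1.0, 2.0])     # the 'position' observable
ones = np.ones(3)                # the integral: sum of entries

# The mean of O is conserved for every state...
assert np.allclose(ones @ O @ H, 0)
# ...but its second moment is not...
assert not np.allclose(ones @ O @ O @ H, 0)
# ...and accordingly O does not commute with H:
assert not np.allclose(O @ H - H @ O, 0)
```

The jumps spread probability symmetrically about state $1$, so the average position stays put while the variance grows.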
(Fact: )
Fragmentary stuff:
And in fact, we can think of the total number of things as an operator on our space of formal power series:
$$ N = N_1 + N_2 $$
where $N_1$ and $N_2$ are the number operators we’ve seen so often before:
$$ N_1 = a_1^\dagger a_1 , \qquad N_2 = a_2^\dagger a_2 . $$
What exactly is the relation between $N$ and the total number of things? Part of the answer is this: the expected total number of things in any state $\Psi$ is given by
$$ \sum N \Psi . $$
Let’s see why!
First of all, what does this expression even mean? The funny ‘sum’ notation here was introduced in Part 6, but now I’m using it for power series in two variables instead of one. Namely, for any power series
$$ \Phi = \sum_{m, n \geq 0} \phi_{m, n} \, z_1^m z_2^n $$
we define
$$ \sum \Phi = \sum_{m, n \geq 0} \phi_{m, n} . $$
Thus, whenever $\Psi$ is a state, $\sum \Psi = 1$, since probabilities must sum to 1. But we also have
$$ N \Psi = \sum_{m, n \geq 0} (m + n) \, \psi_{m, n} \, z_1^m z_2^n , $$
so that
$$ \sum N \Psi = \sum_{m, n \geq 0} (m + n) \, \psi_{m, n} . $$
And this is exactly the expected value of the total number of things.
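As a sketch, here is this computation carried out numerically for a state built (an assumed example, tying back to the earlier section) from two independent Poisson distributions, truncated at a cutoff. The coefficients sum to $1$, and $\sum N \Psi$ recovers the expected total number of things:

```python
import math

# A state given by two independent Poisson distributions with
# parameters x1 and x2, truncated at a cutoff.
x1, x2 = 1.5, 0.8
cutoff = 40
psi = {(m, n): math.exp(-(x1 + x2)) * x1**m * x2**n
              / (math.factorial(m) * math.factorial(n))
       for m in range(cutoff) for n in range(cutoff)}

integral = sum(psi.values())                                # sum of Ψ's coefficients
expected_N = sum((m + n) * p for (m, n), p in psi.items())  # coefficients of N Ψ, summed

assert abs(integral - 1.0) < 1e-9          # probabilities sum to 1
assert abs(expected_N - (x1 + x2)) < 1e-9  # expected total number of things
```

For a product of Poisson distributions the expected total number is just the sum of the parameters, which is what the last assertion checks.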
So, for this and other reasons, we can think of the operator $N$ as an ‘observable’ that counts the total number of things. Now, in quantum mechanics Noether’s theorem tells us that an observable is a conserved quantity—it doesn’t change with time—if it commutes with the Hamiltonian. So you should suspect that
$$ \frac{d}{d t} \sum N \Psi(t) = 0 \quad \text{whenever} \quad [N, H] = 0 , $$
where the commutator is defined to be $[N, H] = N H - H N$.
Puzzle 2. Is $[N, H] = 0$?
If this is true, it should follow that $H$ will commute with any function of the operator $N$, for example the function
$$ \delta(N - n) , $$
where $\delta$ is the Kronecker delta, which equals 1 at the origin and zero elsewhere. This operator should be a projection operator, and it should project to an eigenspace of $N$, say
$$ V_n = \{ \Psi : N \Psi = n \Psi \} , $$
and get a new equilibrium state, say :