This page is about this paper:
This paper tries to predict the next glacial cycle using a stochastic differential equation derived from a deterministic model presented in this paper:
The authors investigate a three-dimensional dynamical system where the variables are
ice volume
atmospheric carbon dioxide concentration
deep-ocean temperature
Saltzman and Maasch considered a deterministic system of this sort obeying equations of roughly the following form (a sketch based on the SM91 model as used by Crucifix and Rougier; consult the papers for the exact signs and coefficients):
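$$\frac{d X}{d t} = -X - Y - v Z - u F(t)$$

$$\frac{d Y}{d t} = -p Z + r Y + s Z^2 - Z^2 Y$$

$$\frac{d Z}{d t} = -q (X + Z)$$

Here $X$ is the ice volume, $Y$ the atmospheric carbon dioxide concentration, $Z$ the deep-ocean temperature, and $u, v, p, q, r, s$ are constant parameters.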
The term $F(t)$ describes time-dependent external forcing caused, for example, by Milankovitch cycles. Note that these equations are linear except for the cubic nonlinearity in the carbon dioxide concentration. If we remove the external forcing, we see that the system has static solutions, also known as fixed points, when:
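$$-X - Y - v Z = 0, \qquad -p Z + r Y + s Z^2 - Z^2 Y = 0, \qquad X + Z = 0$$

Using the first and third conditions to eliminate $X$ and $Y$ turns the second condition into a cubic equation in $Z$ (assuming the sketched form of the equations above).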
These fixed points correspond to equilibrium climates. There may be one, two, or three, depending on how many solutions the second equation has. This depends on the model parameters. When there are three solutions, there are presumably two stable ones corresponding to ‘greenhouse’ and ‘icehouse’ climates, separated by an unstable solution. An effect like this has been widely invoked to explain the glacial cycles.
This model is mathematically similar to the model of Snowball Earth studied by Zaliapin and Ghil; see the Snowball Earth page for more details. However, Zaliapin and Ghil consider a simpler model where the only variable is the amount of ice. For them, the cubic nonlinearity, and the presence of ‘greenhouse’ and ‘icehouse’ phases, come from the ice albedo effect: lots of ice causes lots of reflected sunlight, so the Earth stays cool, while little ice causes lots of absorbed sunlight, so the Earth stays warm. In the Saltzman–Maasch model, by contrast, the cubic nonlinearity involves $Y$, the carbon dioxide concentration.
What was Saltzman and Maasch’s reason for positing this nonlinear CO2 effect? Is this more physically reasonable than using the ice albedo effect as a mechanism to create two stable phases?
Not really an answer, but from
The difficulty for accepting Saltzman’s models as a definitive theory lies in the physical interpretation of the CO2 equation. This equation encapsulates in this model all the interesting dynamics of the system and it is thus crucial to the theory. Some semi-empirical justification for the CO2 equation is given in ref. [44] but the form of this equation has undergone somewhat ad hoc adjustments in SM90. The form present in SM91 is again different, with important effects on the bifurcation structure, while the authors did not justify this latter change based on physical or biogeochemical considerations.
where [44] points to
Crucifix and Rougier take the Saltzman–Maasch model and turn it into a stochastic differential equation by adding three uncorrelated white noise terms:
These equations correspond to the deterministic equations (5a–c) of their paper, augmented by noise according to their equation (8).
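Concretely, an SDE of this kind, $d\mathbf{x} = f(\mathbf{x}, t)\, dt + \sigma\, d\mathbf{W}$ with three independent Wiener processes, can be simulated with the Euler–Maruyama scheme. Here is a minimal sketch in Python; the drift follows the SM91 form sketched above, and the parameter values, noise amplitudes and initial condition are placeholders, not the values fitted in the paper:

```python
import numpy as np

def drift(state, t, p=1.0, q=1.2, r=0.8, s=0.8, u=0.6, v=0.5,
          forcing=lambda t: 0.0):
    """Deterministic right-hand side (SM91 form sketched above; placeholder parameters)."""
    X, Y, Z = state
    dX = -X - Y - v * Z - u * forcing(t)
    dY = -p * Z + r * Y + s * Z**2 - Z**2 * Y
    dZ = -q * (X + Z)
    return np.array([dX, dY, dZ])

def euler_maruyama(x0, t0, t1, dt, sigma, rng):
    """Integrate dx = drift(x, t) dt + sigma dW with three uncorrelated Wiener processes."""
    n_steps = int(round((t1 - t0) / dt))
    path = np.empty((n_steps + 1, 3))
    path[0] = x0
    t = t0
    for i in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=3)  # independent Brownian increments
        path[i + 1] = path[i] + drift(path[i], t) * dt + sigma * dW
        t += dt
    return path

rng = np.random.default_rng(42)
path = euler_maruyama(x0=np.array([0.1, 0.5, 0.2]), t0=0.0, t1=100.0, dt=0.01,
                      sigma=np.array([0.1, 0.1, 0.1]), rng=rng)
```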
The authors treat some of the parameters of the model as fixed and some as random variables. They do a Bayesian analysis on past climate data to estimate these parameters, and then predict when the next glacial would begin, assuming no global warming caused by human interference. In fact, a number of papers suggest that global warming will delay the inception of the next glacial.
Do the authors draw the parameters from the corresponding distribution at every time step, or do they draw them once for each trajectory, a.k.a. ‘particle’?
On page 25 there is the following statement:
At each time step, the algorithm considers n (here: n = 50000) samples of model parameters and states called ‘particles’. Particles are propagated forward in time according to a three-step process: (1) forward integration of the stochastic equations; (2) particle weight estimation as the product of prior weight and a likelihood taking into account the observation and its uncertainty; and (3) particle resampling in order to keep a set of particles that all have approximately the same weight (importance resampling). We emphasise that this algorithm is a filter and not a smoother. Consequently, posterior estimates at any time t are only informed by data prior to time t.
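For orientation, here is a minimal bootstrap particle filter following the same three-step pattern. It is a generic sketch, not the authors’ code: the dynamics, likelihood and observations below are toy placeholders. It also suggests an answer to the question above: the quote says each particle is a sample of model parameters *and* states, so a parameter value is drawn once from its prior and then carried along by its particle, multiplying or dying out at the resampling steps.

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles = 1000                # the paper uses 50000
sigma_model, sigma_obs = 0.3, 0.5

# Each particle bundles a parameter sample with a state sample; parameters are
# drawn once from the prior and then simply propagated along with the state.
params = rng.normal(0.9, 0.05, size=n_particles)  # e.g. an AR(1) coefficient (placeholder prior)
states = rng.normal(0.0, 1.0, size=n_particles)

observations = [0.2, -0.1, 0.4, 0.3]              # placeholder data

for y in observations:
    # (1) forward integration of the stochastic model (placeholder AR(1) dynamics)
    states = params * states + rng.normal(0.0, sigma_model, size=n_particles)

    # (2) weight each particle by the likelihood of the observation given its state
    log_w = -0.5 * ((y - states) / sigma_obs) ** 2
    weights = np.exp(log_w - log_w.max())
    weights /= weights.sum()

    # (3) importance resampling: keep particles in proportion to their weights,
    # so the surviving set has approximately uniform weights again
    idx = rng.choice(n_particles, size=n_particles, p=weights)
    params, states = params[idx], states[idx]

# Posterior summaries at the final time, informed only by data up to that time
# (it is a filter, not a smoother)
print(params.mean(), states.mean())
```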
Tim van Beek: I don’t understand anything after (2)…
First aid for item (2) can be found here:
See the pages Importance sampling and Stochastic filter for the start of a discussion.
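As further first aid for item (2), the core idea of importance sampling fits in a few lines: to estimate an expectation under a target distribution using samples drawn from a different proposal distribution, weight each sample by the ratio of the two densities. A toy example, unrelated to the climate model:

```python
import numpy as np

rng = np.random.default_rng(1)

def normal_pdf(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

# Proposal: normal with mean 1 (pretend this is all we can sample from).
samples = rng.normal(1.0, 1.0, size=100_000)

# Importance weight = target density / proposal density (target: standard normal).
weights = normal_pdf(samples, 0.0, 1.0) / normal_pdf(samples, 1.0, 1.0)

# Self-normalised estimate of E[x^2] under the target; the exact value is 1.
estimate = np.sum(weights * samples ** 2) / np.sum(weights)
print(estimate)  # close to 1
```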
This book covers many algorithms of the type used here:
The paper by Saltzman and Maasch is a follow-up to
Can someone find this free online?
Other papers by Saltzman and Maasch will probably be useful: