Blog - El Niño project (part 3)


This is a blog article in progress, written by John Baez. To see discussions of the article as it is being written, visit the Azimuth Forum.

If you want to write your own article, please read the directions on How to blog.

This paper claims there’s a 3/4 chance that the next El Niño will arrive by the end of 2014:

• Josef Ludescher, Avi Gozolchiani, Mikhail I. Bogachev, Armin Bunde, Shlomo Havlin, and Hans Joachim Schellnhuber, Very early warning of next El Niño, Proceedings of the National Academy of Sciences, February 2014. (Click title for free version, journal name for official version.)

Since it was published in a reputable journal, it created a big stir! But that’s not the main reason we at the Azimuth Project want to analyze and then criticize or improve this paper. The main reason is that it uses a climate network.

Very roughly, the idea is this. We draw a big network of dots representing different places in the Pacific Ocean. We connect two dots with an edge if the sea surface temperatures at those two places are strongly correlated. The paper claims that when an El Niño is getting ready to happen, we get a lot of edges this way. In other words, temperatures in a big region of the Pacific Ocean tend to go up and down in synch!

Whether this idea is right or wrong, it’s interesting—and it’s not very hard for programmers to dive in and study it. Two Azimuth members have done just that: David Tanzer, a software developer who works for financial firms in New York, and Graham Jones, a self-employed programmer who also works on genomics and Bayesian statistics. These guys have really brought new life to the Azimuth Code Project in the last few weeks, and it’s exciting!

Soon I’ll start talking about the programs they’ve written, and how you can help.
But today let me summarize the paper by Ludescher et al. The methodology is also explained here:

• Josef Ludescher, Avi Gozolchiani, Mikhail I. Bogachev, Armin Bunde, Shlomo Havlin, and Hans Joachim Schellnhuber, Improved El Niño forecasting by cooperativity detection, Proceedings of the National Academy of Sciences, 30 May 2013.

The basic idea

The basic idea is to use a climate network. There are lots of variants on this idea, but here’s a simple one. We start with a bunch of points representing different places on the Earth. We draw an edge between two points if the weather at those two places is strongly correlated… in some way that we get to decide. This gives us a bunch of points and edges between points, or in other words, an undirected graph. That’s our climate network! Then we calculate stuff about this network.

There are lots of ways to fill in the details. For example, for any pair of points $i$ and $j$, we could compute the cross-correlation of temperature histories at these points. We could say that $i$ and $j$ are connected by an edge if the cross-correlation is bigger than some value. A small sketch of that recipe follows.
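Here is a minimal sketch of that unweighted recipe in Python with NumPy. Everything in it (the random temperature histories, the number of points, the 0.5 cutoff) is made up purely for illustration; it shows the shape of the computation, not the actual procedure of Ludescher et al.

```python
import numpy as np

rng = np.random.default_rng(0)

n_points = 10    # hypothetical grid points
n_days = 365     # one year of made-up daily temperatures
cutoff = 0.5     # arbitrary threshold for "strongly correlated"

# Fake temperature histories: one row per grid point.
temps = rng.normal(size=(n_points, n_days))

# Pearson correlation between every pair of temperature histories.
corr = np.corrcoef(temps)

# Edges of the climate network: pairs whose correlation exceeds the cutoff.
edges = [(i, j)
         for i in range(n_points)
         for j in range(i + 1, n_points)
         if corr[i, j] > cutoff]

print(len(edges), "edges:", edges)
```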

Ludescher et al try to predict El Niños by studying correlations between daily temperature data for “14 grid points in the El Niño basin and 193 grid points outside this domain”, as shown here:

The red dots are the points in the El Niño basin.

Starting from this temperature data, they build a climate network that changes with time. And starting from that, they calculate a number. When this number is bigger than a certain fixed value, they claim an El Niño is coming.

How do they decide if an El Niño actually arrives? One way is to use the ‘Niño 3.4 index’. This is the area-averaged sea surface temperature anomaly in the yellow region here:

Anomaly means the temperature minus its average over time: how much hotter than usual it is.

Here is what they get:

The red line is the Niño 3.4 index. When this gets above…

The details

For any function $f(t)$, denote the moving average over the past year by:

$$\langle f(t) \rangle = \frac{1}{365} \sum_{d = 0}^{364} f(t - d)$$
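In code this is just a trailing mean over a 365-day window. A quick sketch (the function name and test data are mine, not the paper’s):

```python
import numpy as np

def trailing_mean(f, t, window=365):
    """<f(t)>: the average of f(t), f(t-1), ..., f(t - window + 1).

    f is a 1D array of daily values; t is an integer day index with
    t >= window - 1, so the whole window lies inside the data.
    """
    return f[t - window + 1 : t + 1].mean()

# Quick check on made-up data: days 136..500 average to 318.
f = np.arange(1000, dtype=float)
print(trailing_mean(f, 500))   # 318.0
```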

Let $i$ be a node in the El Niño basin, and $j$ be a node outside of it.

Let $t$ range over every tenth day in the time span from 1950 to 2011.

Let $T_k(t)$ be the daily atmospheric temperature anomaly at node $k$: the actual temperature minus the climatological average for each calendar day.
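As a sketch of what that anomaly computation could look like, assuming for simplicity that the record comes as whole 365-day years with leap days dropped (the function name and data layout are my own):

```python
import numpy as np

def daily_anomalies(temps):
    """Temperature minus the climatological average for each calendar day.

    temps: array of shape (n_years, 365), one row of daily values per year.
    Returns the same shape: each entry minus the average of that calendar
    day across all the years in the record.
    """
    climatology = temps.mean(axis=0)   # one average per calendar day
    return temps - climatology         # broadcasts across the years
```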

Define the time-delayed cross-covariance function by:

$$C_{i,j}^{t}(-\tau) = \langle T_i(t) T_j(t - \tau) \rangle - \langle T_i(t) \rangle \langle T_j(t - \tau) \rangle$$
$$C_{i,j}^{t}(\tau) = \langle T_i(t - \tau) T_j(t) \rangle - \langle T_i(t - \tau) \rangle \langle T_j(t) \rangle$$

They consider time lags $\tau$ between 0 and 200 days, where “a reliable estimate of the background noise level can be guaranteed.”

Divide the cross-covariances by the standard deviations of $T_i$ and $T_j$ to obtain the cross-correlations.

Only temperature data from the past are considered when estimating the cross-correlation function at day $t$.

Next, for nodes $i$ and $j$, and for each time point $t$, the maximum, the mean and the standard deviation around the mean are determined for $C_{i,j}^t$, as $\tau$ varies across its range.

Define the link strength $S_{ij}(t)$ as the difference between the maximum and the mean value, divided by the standard deviation.

They say:

Accordingly, $S_{ij}(t)$ describes the link strength at day $t$ relative to the underlying background and thus quantifies the dynamical teleconnections between nodes $i$ and $j$.
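Here is how those last few steps might look in code, for one pair of nodes at one time. This is my simplified reading of the definitions above, not the authors’ actual implementation: in particular, I normalize by standard deviations computed over the same one-year windows as the covariances.

```python
import numpy as np

def link_strength(Ti, Tj, t, max_lag=200, window=365):
    """S_ij(t) = (max - mean) / std of the lagged cross-correlations.

    Ti, Tj: 1D arrays of daily anomalies at nodes i and j.
    t: integer day index; only days <= t are used, so we need
    t >= window + max_lag - 1.
    """
    days = np.arange(t - window + 1, t + 1)   # the year ending at day t

    def cross_corr(x, y, lag):
        # Correlation of x(day - lag) with y(day), averaged over `days`.
        a, b = x[days - lag], y[days]
        cov = (a * b).mean() - a.mean() * b.mean()
        return cov / (a.std() * b.std())

    # tau = -max_lag..max_lag: positive tau delays T_i, negative delays T_j.
    corrs = np.array([
        cross_corr(Tj, Ti, -tau) if tau < 0 else cross_corr(Ti, Tj, tau)
        for tau in range(-max_lag, max_lag + 1)
    ])
    return (corrs.max() - corrs.mean()) / corrs.std()
```

As far as I can tell, the number they compare against a fixed threshold (mentioned earlier) is then the average of $S_{ij}(t)$ over all pairs with $i$ inside the El Niño basin and $j$ outside it.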

Niño 3.4

Niño 3.4 is the area-averaged sea surface temperature anomaly in the region 5°S-5°N and 170°W-120°W. You can get Niño 3.4 data here:

Niño 3.4 is just one of several official regions in the Pacific:

  • Niño 1: 80°W-90°W and 5°S-10°S.
  • Niño 2: 80°W-90°W and 0°-5°S.
  • Niño 3: 90°W-150°W and 5°S-5°N.
  • Niño 3.4: 120°W-170°W and 5°S-5°N.
  • Niño 4: 160°E-150°W and 5°S-5°N.
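If you want these regions in a program, here is the list above transcribed into a small Python dictionary. The bounding-box convention (south and west as negative degrees) is my own choice, not an official format:

```python
# (lat_min, lat_max, lon_min, lon_max) in degrees; south and west negative.
NINO_REGIONS = {
    "Nino 1":   (-10, -5,  -90,  -80),
    "Nino 2":   ( -5,  0,  -90,  -80),
    "Nino 3":   ( -5,  5, -150,  -90),
    "Nino 3.4": ( -5,  5, -170, -120),
    "Nino 4":   ( -5,  5,  160, -150),  # spans the date line: 160°E to 150°W
}
```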

For more details, read this:

category: blog, climate