The Azimuth Project
Blog - El Niño project (part 4) (Rev #5)

This is a blog article in progress, written by John Baez. To see discussions of the article as it is being written, visit the Azimuth Forum.


As the first big step in our El Niño project, Graham Jones replicated the paper by Ludescher et al that I explained in Part 3. Let’s see how this works!

Today I’ll try to explain this to people who understand programming reasonably well. I don’t. Next time I’ll explain how I actually ran Graham’s software myself, starting from scratch. I hope that will be helpful to a different set of people.

Getting temperature data

The idea is to predict El Niños starting from monthly average surface air temperatures at a grid of locations in the Pacific Ocean:

Here is the result (click to enlarge):

This is almost but not quite the same as the graph in Ludescher et al:
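As explained in Part 3, the basic ingredient of Ludescher et al's method is time-delayed cross-correlations between the temperature histories of pairs of grid points. Here is a toy Python sketch of that ingredient — not Graham's code, just an illustration, with a made-up function name and synthetic data:

```python
import numpy as np

def lagged_correlations(x, y, max_lag):
    """Pearson correlation of x[t] with y[t + lag], for each lag in
    -max_lag .. max_lag.  Ludescher et al build the 'link strength'
    between two grid points from statistics of correlations like these."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    corrs = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[:n - lag], y[lag:]      # pair x[t] with y[t + lag]
        else:
            a, b = x[-lag:], y[:n + lag]
        a = a - a.mean()
        b = b - b.mean()
        corrs[lag] = float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))
    return corrs

# Toy check: if y is x delayed by 3 steps, the correlation peaks at lag 3.
rng = np.random.default_rng(0)
x = rng.standard_normal(400)
y = np.roll(x, 3)
c = lagged_correlations(x, y, max_lag=10)
best = max(c, key=c.get)
```

In the real method these correlations are computed over running windows, and the link strength compares the peak correlation to the mean and spread over lags; this sketch only shows the correlation-versus-lag step.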

Niño 3.4

In Part 3 I mentioned a way to get Niño 3.4 data from NOAA. However, Graham started with data from a different source:

Monthly Niño 3.4 index, Climate Prediction Center, National Weather Service.
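If you want to play with data like this yourself, here is a sketch of loading a monthly index table with pandas. The column names and values below are hypothetical, modelled on the general shape of the CPC's whitespace-separated monthly files — check the actual file before relying on this:

```python
import io
import pandas as pd

# Hypothetical sample in the general shape of a CPC monthly Niño 3.4
# table (whitespace-separated; the numbers are made up for illustration).
sample = """\
YR MON TOTAL ClimAdjust ANOM
1950 1 24.55 26.17 -1.62
1950 2 25.06 26.64 -1.58
1950 3 25.87 27.09 -1.22
"""

df = pd.read_csv(io.StringIO(sample), sep=r"\s+")
df = df.rename(columns={"YR": "year", "MON": "month"})
df["day"] = 1  # pin each monthly value to the first of the month
nino34 = pd.Series(df["ANOM"].to_numpy(),
                   index=pd.to_datetime(df[["year", "month", "day"]]))
```

The result is a time-indexed series of anomalies, convenient for plotting against predictions.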

The actual temperatures in Celsius are close to the NOAA data I mentioned last time. But the anomalies, which are what actually give the Niño 3.4 index, are rather different, because they are computed in a different way, one that takes global warming into account. See the website for details.
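To see why the choice of climatology matters, here is a toy comparison of two ways of computing anomalies: subtracting one fixed climatology for the whole record, versus re-choosing the base period every few years, which is roughly the spirit of the CPC's scheme (the exact base periods below are my own simplification, not the CPC's):

```python
import numpy as np

def monthly_climatology(temps, months, idx):
    """Mean temperature for each calendar month, over the index set idx."""
    return np.array([temps[idx][months[idx] == m].mean() for m in range(12)])

def anomalies_fixed(temps, months):
    """Anomaly against a single climatology from the whole record."""
    clim = monthly_climatology(temps, months, np.arange(len(temps)))
    return temps - clim[months]

def anomalies_sliding(temps, years, months, window=30, step=5):
    """Anomaly where each year's climatology comes from a base period
    re-chosen every `step` years, so a slow warming trend does not
    inflate later anomalies.  (A sketch, not the CPC's exact rule.)"""
    out = np.empty_like(temps, dtype=float)
    for y in np.unique(years):
        block_start = y - (y % step)          # start of this 5-year block
        base = (years >= block_start - window) & (years < block_start)
        if not base.any():                     # early years: fall back to full record
            base = np.ones(len(temps), dtype=bool)
        clim = monthly_climatology(temps, months, np.where(base)[0])
        sel = years == y
        out[sel] = temps[sel] - clim[months[sel]]
    return out

# Toy record: a seasonal cycle plus a slow warming trend, 1950-2010.
years = np.repeat(np.arange(1950, 2011), 12)
months = np.tile(np.arange(12), 61)
temps = 25 + 2 * np.sin(2 * np.pi * months / 12) + 0.02 * (years - 1950)
```

On this trending toy record, the sliding scheme gives a smaller anomaly for the final month than the fixed scheme does, because its climatology keeps catching up with the trend — which is the point of the CPC's adjustment.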

The code is available on GitHub. It took about 35 minutes to run.

category: blog, climate