When it comes to a changing climate, it has been said that ‘the proof is in the pudding’, supposedly meaning that a close examination of the statistical data will quickly and clearly reveal the facts. But statistics can be misleading at best and downright confusing at worst, as in the case of last year’s climate data. 

As most of us know, 2011 was a tough weather year for many U.S. farmers. Exceptional floods, historic droughts, and intense, widespread wildfires are blamed for costing U.S. agriculture billions of dollars in losses last year.

But climate statistics for 2011, recently released by the National Oceanic and Atmospheric Administration’s National Climatic Data Center, indicate that in spite of the weather disasters that plagued large areas of the nation, temperature and rainfall averages and other relevant climate data were not significantly different from those of most years.

Or was it? You be the judge.

The average temperature last year for the contiguous U.S. was 53.8 degrees F, just one degree F above the 20th century average. Despite that single-degree difference, 2011 ranks as the 23rd warmest year on record. By contrast, precipitation across the nation last year averaged near normal, masking record-breaking extremes of both drought and flooding.

To confuse things further, La Niña events helped keep the average global temperature below recent trends. As a result, 2011 tied with 1997 as the 11th warmest year on record; surprisingly, it was also the second coolest year of the 21st century to date, yet tied with the second warmest year of the 20th century.

Are you confused yet?

Digging deeper into the statistical data for the U.S., it becomes apparent that while many areas suffered extreme weather, a balancing of extremes took place that makes 2011’s climate statistics look more normal than they actually were.
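
The arithmetic behind that balancing is easy to see with a toy calculation. The short Python sketch below uses entirely hypothetical regional precipitation anomalies (the numbers and region labels are illustrative only, not NOAA figures) to show how record dryness in one part of the country and record wetness in another can cancel out in a national average, while the spread between regions remains enormous.

    # A minimal sketch with hypothetical numbers (not NOAA data) showing how
    # opposite regional extremes can average out to a near-normal national figure.
    from statistics import mean, pstdev

    # Hypothetical 2011 precipitation anomalies by region, in inches versus normal
    anomalies = {
        "Southern Plains (record drought)": -12.0,
        "Ohio Valley (record flooding)": +11.0,
        "Northeast (record wet)": +9.0,
        "Southwest (very dry)": -8.5,
        "Upper Midwest (near normal)": +0.5,
    }

    values = list(anomalies.values())
    print(f"National average anomaly: {mean(values):+.1f} in.")  # prints roughly +0.0
    print(f"Spread (std. dev.): {pstdev(values):.1f} in.")       # prints roughly 9.2

    # The average alone looks 'near normal'; only the spread reveals the extremes.

In this made-up example the national average anomaly is essentially zero, exactly the kind of figure that looks unremarkable in a summary table, even though every region in the list had a dramatic year.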