The only constant is change
November 7, 2011 4:24 PM

Will someone tell me what's wrong with the data cited and resulting conclusion here that the last ten years have seen a cooling trend in the United States?

I know the impulse will be to scream Koch Brothers! Rethuglicans! Toolsoftheoilindustryyyyyyy! And, fair enough. But I'm really curious about what's being said rather than who is saying it. If it's misleading, how? What trick is getting pulled?

Thank you.
posted by codswallop to Science & Nature (10 answers total) 5 users marked this as a favorite
 
See this.
posted by mcwetboy at 4:32 PM on November 7, 2011 [22 favorites]


Well, part of it is that "global warming" is a misnomer. That's why it's usually called "climate change" these days. More extreme temperatures are generally the trend -- colder winters and hotter summers. The former is readily apparent; the latter is more subtle (though all the East Coast regions show it). The linked analysis also ignores an increase in hurricanes and dangerous thunderstorms.

The rest of it is that a decade -- a decade with widespread pollution controls -- is a poor window for looking at something that's been developing since the dawn of the Industrial Revolution. It's slightly better than looking at an unseasonably cool day and saying, "Well where's your precious global warming now?"... but not by much.
posted by supercres at 4:33 PM on November 7, 2011 [4 favorites]


Just from skimming, and not really going in-depth into the numbers:

a) The global warming in the BEST study is a global effect. For various reasons, looking at just the USA doesn't make sense (the effect being more extreme at the poles is one reason I know of for sure).

b) Ten years is not enough data to show any trend, especially this data. Look at their summer data -- it's all over the place. Most of it shows a few warm summers a few years ago, and then a few cool summers in the past two or three years that pull the trendlines down. That's just not enough to draw any real conclusions. In the big graphs at the bottom, if you stopped them at 2007 instead of 2010, the trendlines would go up instead of down. If a couple of years of data can skew your results that much, you're just glorifying outliers.
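
If you want to see how little it takes, here's a toy example -- the anomaly numbers below are made up, not the actual NOAA data, but they have roughly the shape I'm describing (warm mid-decade, a few cool years at the end):

    import numpy as np

    # Invented US summer anomalies for 2002-2010, in deg F.
    years = np.arange(2002, 2011)
    anoms = np.array([0.6, 0.7, 0.5, 0.9, 1.0, 0.8, 0.1, 0.2, 0.4])

    slope_full = np.polyfit(years, anoms, 1)[0]           # fit through 2010
    slope_2007 = np.polyfit(years[:6], anoms[:6], 1)[0]   # stop at 2007

    print(f"trend through 2010: {slope_full:+.3f} deg/yr")  # comes out negative
    print(f"trend through 2007: {slope_2007:+.3f} deg/yr")  # comes out positive

Same made-up decade; whether the "trend" points up or down depends entirely on where you cut it off.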
posted by brainmouse at 4:33 PM on November 7, 2011 [1 favorite]


Whenever you are presented with data, ask yourself the following question: has the author clearly described and justified why s/he is using these units of analysis? If the answer is "no," ramp up your skepticism.

In this case, though I am not in any way an expert on this topic, the thing that immediately jumps out at me is: why are they comparing a 100-year average (1901-2000) to a 10-year average (2002-2011)? Perhaps there is a good reason for comparing two wildly different time periods, but they don't bother to give one, which is immediately suspect.
posted by googly at 4:49 PM on November 7, 2011 [1 favorite]


Response by poster: Holy crap, mcwetboy - that is amazing. I look forward to exploring that website.
posted by codswallop at 5:00 PM on November 7, 2011


Saturday's high temperature in NYC was 49 degrees. Yesterday's high was 54. Today's was 62 and tomorrow it is expected to reach 67. Does that trend mean fall is over? Obviously not, because four days is way too short a time period to judge the longer-term trend -- that the Northern Hemisphere is getting cooler because of the tilt of the earth relative to the sun's rays.

Likewise, the variation of temperature over a few years is not necessarily going to accurately reflect a decades- or centuries-long temperature trend. Real Climate has a great graph of this, where they plot eight-year trends on top of annual averages over a longer term on a global scale.
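
You can see the same thing with a quick simulation -- the series below is invented (a steady warming trend plus random year-to-year noise), not Real Climate's data, but it counts how often short windows point the "wrong" way:

    import numpy as np

    rng = np.random.default_rng(42)
    years = np.arange(1950, 2011)
    # Invented anomalies: 0.015 deg/yr of underlying warming plus noise.
    temps = 0.015 * (years - 1950) + rng.normal(0.0, 0.2, years.size)

    # Fit a trend to every eight-year window and count how many come out negative.
    slopes = [np.polyfit(years[i:i + 8], temps[i:i + 8], 1)[0]
              for i in range(years.size - 7)]

    print(f"61-year trend: {np.polyfit(years, temps, 1)[0]:+.4f} deg/yr")
    print(f"eight-year windows that 'show cooling': {sum(s < 0 for s in slopes)} of {len(slopes)}")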

Spatially, the same sort of sleight-of-hand is going on. Let's say there are 1000 temperature measuring stations scattered around the world's land masses. If you were to randomly choose 60 of them, or six percent, you would probably get a decent approximation of the globally averaged temperature trend. If, instead of a random spatial distribution, you chose a sample of 60 stations closely lumped together, then what you'd get, because temperature is spatially correlated, is the trend for that lump of stations. The land mass of the contiguous US is less than six percent of the world's land mass. New York's long-term trend isn't much different from Philadelphia's, which isn't much different from Pittsburgh's, which isn't that much different from Chicago's or Atlanta's or Dallas's.
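
Here's a toy version of that sampling argument -- everything below is invented, it's just there to show the mechanism of spatial correlation:

    import numpy as np

    rng = np.random.default_rng(0)
    # 1000 made-up stations in 50 regions of 20; stations in the same region
    # share a regional departure from the global trend, so they're correlated.
    global_trend = 0.02                                  # deg/yr, invented
    regional = rng.normal(0.0, 0.03, 50)                 # one departure per region
    stations = np.repeat(global_trend + regional, 20) + rng.normal(0.0, 0.005, 1000)

    scattered = rng.choice(stations, 60, replace=False).mean()  # 60 random stations
    clumped = stations[:60].mean()                              # 60 stations in 3 adjacent regions

    print(f"true global trend:     {global_trend:+.3f} deg/yr")
    print(f"60 scattered stations: {scattered:+.3f} deg/yr")  # usually lands near the global number
    print(f"60 clumped stations:   {clumped:+.3f} deg/yr")    # whatever those 3 regions happen to do

The contiguous US plays the role of the clump here.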

In short, what Watts is doing is cherry-picking stations to show what he wants to show. His math may be right and show the correct trend for those stations for his chosen time period, but his sample is not representative of what is happening across the globe at a longer time scale.
posted by plastic_animals at 5:15 PM on November 7, 2011


Also read all of this. It's a great overview. And what's nice is that they are continuing to update and add to this, so hopefully it will be around for the long haul.
posted by zomg at 6:34 PM on November 7, 2011


The US ten-year data just shows annual variation. 2008 and 2009 were cooler than usual (mostly in the north and east, but enough to affect the national average). They were preceded by a couple years that were warmer than usual. So, hey! cooling trend!

Does it mean anything? Probably not.

Towards the bottom of the article, he's got charts of US and global temperature from 1880 to 2000. The global trend in that period is generally up. In the US, the temperature heads up till the mid-thirties (the Dust Bowl), then levels off, and edges up a bit again at the end of the century.

It's mostly an exercise in dicing up data in different ways. You can't make very meaningful extrapolations from small data sets.
posted by nangar at 2:00 AM on November 8, 2011


Good answers here. What I wanted to chime in to say is that ten years is closer to "weather" than "climate", and that picking out a single region to prove or disprove a global phenomenon is simply dishonest.
posted by gjc at 5:00 AM on November 8, 2011


That's why it's usually called "climate change" these days. More extreme temperatures are generally the trend-- colder winters and hotter summers.

I'm not sure if this is true.

I can't back this up, at least not right now, since I didn't save any of this, but in the summer of 2010, when I was hot, sweaty and bored, I compiled cooling/heating degree-day info for the Minneapolis area for the last hundred years or so (using info from here and here), and also the highest high temps, lowest highs, lowest lows and highest lows from the "Daily Maximum/Minimum Temps" pages by decade here.
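
I don't have the spreadsheet anymore, but the tally worked roughly like this -- the filename and column names below are just placeholders for whatever you end up exporting from those pages:

    import csv
    from collections import defaultdict

    BASE = 65.0               # the standard base temperature (deg F) for degree days
    hdd = defaultdict(float)  # heating degree days, summed per year
    cdd = defaultdict(float)  # cooling degree days, summed per year

    # Assumes a CSV of daily records with columns: date (YYYY-MM-DD), tmax, tmin.
    with open("msp_daily_temps.csv") as f:
        for row in csv.DictReader(f):
            year = row["date"][:4]
            mean_temp = (float(row["tmax"]) + float(row["tmin"])) / 2.0
            hdd[year] += max(0.0, BASE - mean_temp)  # cold days add heating degrees
            cdd[year] += max(0.0, mean_temp - BASE)  # hot days add cooling degrees

    for year in sorted(hdd):
        print(year, round(hdd[year]), round(cdd[year]))

Then it's just a matter of eyeballing which decades the warm winters (the low heating totals) pile up in.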

What I found really surprised me. July and August of 2010 were really hot here, but over the last 100 years that summer didn't stand out nearly as much as I expected. Summers have been getting somewhat hotter, but over the entire 100+ year span, not that much. There have been nearly as many hellaciously hot summers in the past as now. What was really overwhelming was how much warmer our winters were becoming. Like really, really dramatic. Sorry, I can't remember the exact numbers anymore. But a surprisingly large majority of the warmest winters on record had occurred in the past 15 or 20 years or so.

I was only looking at Twin Cities info (apologies to gjc -- I'm sure you're right that looking only at a single region is a really bad idea) and I'm not an expert in meteorology, statistics or anything else meaningful, so take all this with a GIGANTIC lump of salt, but that's what I observed.
posted by marsha56 at 2:20 AM on November 9, 2011


This thread is closed to new comments.