forecast forecast
December 6, 2005 11:09 AM   Subscribe

Is there a measure for the accuracy of weather predictions? Have predictions been increasing in accuracy recently? Over the last decades? Any notable major changes? Is a temperature prediction much more accurate than a precipitation % forecast?

What about the difference in accuracy (and its variance) between a 'tomorrow' forecast and a forecast for 5 days from now? Should I expect any advances in forecasting accuracy? Any sites where I can see these statistics visually or as raw data? Any info is good. Thanks.
posted by foraneagle2 to Technology (7 answers total)
 
What you're referring to is called "forecast skill". Unfortunately, a vast amount of research now goes into measuring the skill of numerical models rather than of public forecasts (i.e. the NWS, or if you don't mind litigation, AccuWeather). So I'm not sure you'll get the results you want. But this and this might put you on the right track.
posted by rolypolyman at 11:22 AM on December 6, 2005


I know I heard on NPR that the forecasting of hurricane tracks has gotten substantially better (by an order of magnitude?) in the past decade. I wish I had more info for you.
posted by OmieWise at 11:29 AM on December 6, 2005


Tornado prediction and tracking has become much more accurate; this is probably more a result of high-resolution Doppler radar than anything else.
posted by geoff. at 12:29 PM on December 6, 2005


Here's a related Straight Dope article

With regards to short-term forecasts:
On the question of how reliable the figures are, the amazing truth is that they are absolutely 100 percent reliable all the time--that is, presuming the raw data were fed in properly and the calculations done correctly. Remember, we're just talking about probabilities here. A probability isn't "wrong" if it tells you there's only a 10 percent chance of rain and it rains anyway; it's only wrong if a series of such predictions doesn't pan out over the long haul.
posted by Brian James at 12:51 PM on December 6, 2005
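The "pan out over the long haul" idea from the Straight Dope quote can be sketched in a few lines of Python. This is just an illustration, not anything from the article: group forecast/outcome pairs by the stated probability and see whether each group's observed rain rate matches it (the data below are invented).

```python
from collections import defaultdict

def calibration(forecasts, outcomes):
    """Group (probability, rained) pairs by the forecast probability and
    return each group's observed rain frequency. A well-calibrated
    forecaster's observed frequencies should roughly match the keys."""
    buckets = defaultdict(list)
    for p, rained in zip(forecasts, outcomes):
        buckets[p].append(rained)
    return {p: sum(obs) / len(obs) for p, obs in sorted(buckets.items())}

# 1000 invented days on which the forecast was "10% chance of rain":
# a calibrated forecaster should see rain on roughly 100 of them.
forecasts = [0.1] * 1000
outcomes = [1] * 100 + [0] * 900
print(calibration(forecasts, outcomes))  # {0.1: 0.1}
```

So a single rainy day after a 10% forecast proves nothing; only the bucket's long-run frequency can show the forecaster was wrong.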


Accuracy would depend a lot on the specific type of weather system you are expecting (or not expecting) to encounter. It might still be useful to look at how weather prediction accuracy is changing en masse, but that kind of number wouldn't tell you whether the forecast of snow tomorrow is reliable.
posted by Chuckles at 2:32 PM on December 6, 2005


they are absolutely 100 percent reliable all the time--that is, presuming the raw data were fed in properly and the calculations done correctly
Actually, the article is not entirely accurate. We can feed in the raw data and do the calculations, but some complex physical processes, such as heat transfer, must be parameterized rather than computed directly, and we always need more raw data than we can supply. Just a minor point.
posted by rolypolyman at 2:42 PM on December 6, 2005


it's only wrong if a series of such predictions doesn't pan out over the long haul.

Right. Which is what we're interested in here. An isolated probability prediction isn't wrong, but if a weather forecaster predicts a 10 percent chance of rain on 1000 different days, and it rains on 900 of them, what you've got there is a crappy forecaster. There are statistical methods available to analyze this kind of thing. So the question is, has it been done?
posted by ikkyu2 at 4:51 PM on December 6, 2005
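One standard statistical method of the kind ikkyu2 is asking about is the Brier score: the mean squared difference between the forecast probability and the outcome (1 = rain, 0 = no rain), with lower scores being better. A minimal sketch, using invented numbers that match the 1000-day example above:

```python
def brier_score(forecasts, outcomes):
    """Mean of (forecast probability - outcome)^2 over all pairs.
    0 is a perfect score; 0.25 is what always forecasting 50% earns."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# The crappy forecaster: says 10% but it rains on 900 of 1000 days.
bad = brier_score([0.1] * 1000, [1] * 900 + [0] * 100)
# A calibrated forecaster: says 10% and it rains on 100 of 1000 days.
good = brier_score([0.1] * 1000, [1] * 100 + [0] * 900)
print(round(bad, 2), round(good, 2))  # 0.73 0.09
```

Forecast-verification work typically compares a forecaster's Brier score against a reference such as climatology, which is essentially the "forecast skill" rolypolyman mentioned at the top of the thread.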

