Grand challenges, revisited

In 1995 we ran a BYTE feature entitled "The Grand Challenges." The story began:

Even using the biggest, most powerful computers available today, we still cannot reliably predict the weather for next week in any significant detail. And this is just one of a number of problems that we would like to solve using computers, but which are so complex that they are essentially beyond the abilities of current technology.

Recently, I've noticed that weather forecasts are pretty darned reliable. For those of us living through this weirdly sun-deprived spring in New England, that's been rather depressing. We'd love the prediction of yet another cloudy week to be wrong, but it probably won't be.

A story in the current National Geographic confirms this trend:

These days a forecast of the daily high temperature in advance is likely to be off by about five degrees in the United States, two degrees better than in 1975. Flash flood warning lead times have improved from 7 minutes in 1987 to 47 minutes in 2004.
...
Satellite images now help meteorologists make forecasts 72 hours in advance that are as reliable as the 36-hour forecasts of 25 years ago.

The story features IBM's Deep Thunder project, which pushes the envelope on short-term, high-resolution forecasting. New modeling algorithms are part of what makes this possible, according to the project's FAQ, but mainly it's a function of cheaper, more powerful CPUs and GPUs.
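
Just to put that hardware story in perspective, here's a rough back-of-envelope sketch. It assumes the classic 18-month doubling period as a rule of thumb (that figure, and the loose link between raw compute and forecast skill, are my assumptions, not numbers from the article or the FAQ):

```python
# Back-of-envelope sketch: how much raw compute growth does a Moore's-law
# doubling curve imply over the 25-year span in the National Geographic
# comparison (today's 72-hour forecasts vs. the 36-hour forecasts of 25
# years ago)? Assumes an 18-month doubling period -- a rule of thumb, not
# a figure from the article.

def moores_law_multiplier(years: float, doubling_months: float = 18.0) -> float:
    """Return the compute growth factor after `years`, given a doubling period."""
    doublings = (years * 12.0) / doubling_months
    return 2.0 ** doublings

if __name__ == "__main__":
    span = 25  # years separating the two forecast-skill benchmarks
    factor = moores_law_multiplier(span)
    print(f"~{factor:,.0f}x more compute over {span} years, "
          f"for roughly a doubling of reliable forecast lead time")
```

On those assumptions the multiplier comes out around 100,000x, which is a reminder that forecast skill scales far more slowly than the hardware behind it.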

As the forecasting challenge begins to succumb to Moore's law, it's time to set some new goals. Here's mine: I want to find the butterfly whose flapping wings will make it sunny here this week!


Former URL: http://weblog.infoworld.com/udell/2005/06/06.html#a1244