BOSTON (CBS) — No doubt about it – jokes about the weather forecast being wrong are here to stay. No matter how much forecasting improves, this stigma continues. And there’s some truth to that. Weather forecasts are not 100% accurate, and probably never will be. A forecast is a prediction of the future, after all, and there is inherent uncertainty in figuring out what will happen hours, days, weeks, and especially months from now.
But it’s a worthwhile exercise to think about how far we have come. As we celebrate 70 years of WBZ, it’s interesting to note that the first tornado warning in the United States was issued exactly 70 years ago. The forecast was made at Tinker Air Force Base in Oklahoma, and it verified! Warnings were different then – it was basically a ‘heads up, there could be a tornado tomorrow’ kind of thing. But it was a leap forward at the time. No one had attempted such a prediction before, and as a result, storms were much deadlier.
Just a few years later, there was a similar first in local forecasting. The first ever severe thunderstorm warning in New England was issued on June 9th, 1953. And that storm ended up being one of the worst ever experienced in this part of the country. This was the day of the infamous Worcester tornado, which killed 94 people and left 1,300 injured. At the time, it was the costliest tornado in U.S. history and remained at the top of the list until 1979.
To give warning, there wasn’t much of a visual tool on television beyond chalk and a blackboard. While that almost seems folksy now, you don’t have to look far back to find a lack of technology.
“If you go back to the 1980s in southern New England and were awake at 2 in the morning, you would only have 3 weather observations: it would have been Boston, Providence, and Hartford. That’s it,” says Joe Dellicarpini, a meteorologist at the National Weather Service in Norton. Now, there are dozens and dozens of such observations at all times of day across the same area.
Even in the early 90s, the tools we had at hand seem ridiculously out of date now. Take radar, for example.
“We actually had a scope and the radar spun and we would draw on the scope with a grease pencil where the storms were and put the different levels of how strong they were, then we put a paper overlay on it and traced it with markers, and there’s your radar picture!” recalls Dellicarpini.
Doppler radar, something we think of as a bare necessity now, didn’t come about until the early to middle 90s across the country. Then in 2012 came a huge leap to dual-pol radar. Now we are able to see the size and shape of raindrops and ice crystals in the atmosphere, scan at a much faster clip, and instantly detect tornadoes on the ground via the debris they loft. Satellite imagery from GOES-16, updated as often as every minute, gives us resolution so good that we can see rotating updrafts. Until the past couple of years, these capabilities were impossible. And it is all thanks to the rapid growth of computing.
“The ability to take our science, which is very, very complex, put it into a supercomputer, and have that supercomputer figure out what it means is by far the biggest change for our science,” says Peter Neilley, Director of Weather Forecasting Sciences and Technologies at IBM in Andover.
IBM has invested heavily in ‘the internet of things’ for a variety of purposes, including weather. This essentially means mining all sorts of devices to improve data collection and, in turn, things like our weather forecasts. Neilley notes that many smartphones, for example, have weather sensors in them. Barometers, thermometers, and hygrometers, each the size of a sliver of your pinky nail, exist in phones, which IBM can tap into via apps like Weather Underground and The Weather Channel.
How does this help? To produce better forecasts, the biggest ingredient we need is more data for computer models to start with (called an initialization). There are gaps in that data over large oceans, at the poles, in sparsely populated areas, and in the atmosphere above the surface. The more companies like IBM can tap into devices to fill in those gaps, the better the inputs for the models get. Improvements also come from higher-resolution satellites with more innovative instruments on them and from expanding the radar network. Some companies have experimented with installing small, lower-power Doppler radars on existing cell towers to cast a tighter net across a region.
As the gaps shrink and data quality increases, the models improve. Combined with growing computing power, this is how forecasts will keep getting better and predict with detail farther out in time.
Is there a limit? Neilley explains that there is an interesting roadblock to consider – the question of whether computing power will continue to grow at an exponential pace or begin to level off. In other words, is Moore’s Law (which states that the number of transistors on a microprocessor chip doubles every two years or so) failing? The field of meteorology is always one of the first to gobble up new supercomputers, but it’s not a lock that they will increase in strength at the same pace in the years ahead. Quantum computing may be the way around this problem.
Back at the weather service, the big area of research to watch is the next generation of radar. The NWS is taking a look at phased array radar, a new breed with no moving parts. It is based on electronics that can change frequencies and scan areas of interest on demand, though that technology is probably at least 12 years away.
There is no doubt that forecasts will continue to improve, and the technology will improve along with them. A 7-day forecast in 2018 is as accurate as a 2-day forecast was 50 years ago. With the way things are moving, Dellicarpini says that usable 7-day window could easily double or triple in our lifetimes.
“You’re talking years and years in the future, I think yes, we’ll see more specific forecasts. Maybe out to two weeks, maybe up to three weeks — I don’t see any reason why not,” says Dellicarpini.