The ultimate in weather forecasting

The secret to accurate and timely weather forecasting lies in getting as many sensors out in the field as possible.

John Keller, Editor in Chief

The U.S. National Weather Service gathers data from thousands, perhaps millions, of sensors in places ranging from satellites orbiting above Earth to local airports, and even neighborhood elementary schools.

This network of sensors, which includes Doppler weather radar installations, infrared and visible-light cameras on satellites, barometric instruments aboard hurricane-chasing aircraft, and even home weather stations, enables forecasters to spot dangerous weather quickly and sound the alarm for those in its path.

Only last month, meteorologists at the National Weather Service forecast office in Dodge City, Kan., were able to save perhaps hundreds of lives by giving the 1,500 residents of Greensburg, Kan., a precious 20 minutes' warning of an F5-strength tornado bearing down on them.

A half hour after the warnings went out, Greensburg was gone. News photos showed what remained of the town; it looked like the 1945 photos of Hiroshima. Not a house, not a business was left standing. Tornadoes don’t get any stronger than F5. Tornado chasers call them “the Finger of God” for good reason.

Greensburg was leveled (it just doesn't exist anymore), yet only 10 people died. That 20 minutes of warning from the Weather Service made all the difference. If that catastrophe had struck only 30 years ago, that 20 minutes of warning would have been more like two minutes, and hundreds would have been dead.

The sensor information available to forecasters in Dodge City is what made it all happen. Think about it: we really haven't had weather warnings like this for very many years.

People often talk today about the so-called "Blizzard of '78" that hit New England almost three decades ago. That storm dropped three to four feet of snow, stranded people on interstate highways, kept people from reaching home for days, and required a call-up of the National Guard.

People knew snow was coming that day, but they didn't know how much, and they headed out for work as usual that morning because the Weather Service couldn't accurately call the storm. The resulting inconvenience and suffering are firmly ingrained in regional lore. Everyone who was there has a story; just ask.

My mother-in-law, who's almost 80, told me about the "Great Hurricane of '38," which flattened Long Island and devastated parts of New England almost seven decades ago.

Keep in mind that when a hurricane is coming today, long before the storm actually hits, most people in its path are either hunkered down or evacuated (or hosting hurricane parties if it's New Orleans). Want to know where my mother-in-law, Hazel Ciarcia, was when the Great Hurricane of '38 first started tearing through?

She was on a canoe in the middle of Long Pond in Littleton, Mass. She and her neighbors had no idea what was coming until the storm was on them. Hazel and her brother had to paddle for their lives, abandon their canoe at lake’s edge, and make a run for it.

The next day not a tree was standing on their street. Their canoe was wrecked. Families had to rebuild from scratch. She never would have set foot in that canoe had she had any idea of what was in store.

We know what 20 minutes of warning did for the folks in Greensburg, Kan. Now imagine if we could vastly increase the number of sensors available to weather forecasters. Maybe that 20 minutes goes up to half an hour or more. Maybe that number of 10 dead goes down to zero.

Such a future, perhaps, is not so far-fetched because of work at jet engine manufacturer Pratt & Whitney in East Hartford, Conn., and parallel processing expert Mercury Computer Systems in Chelmsford, Mass.

Experts from the two companies are proposing a system of sensors inside the engines of passenger aircraft, coupled with parallel processing technology inside the aircraft fuselage.

These sensors would measure the air passing through the engines, and parallel processors running algorithms would make judgments about weather conditions in front of the aircraft. Network this information with other aircraft, and the airlines could evaluate atmospheric conditions virtually anywhere.
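In miniature, the data flow described above, engine sensors feeding an onboard processor that infers conditions ahead, might look like the following sketch. Everything here is a hypothetical illustration, not Pratt & Whitney's actual design: the reading fields, the pressure-variance test, and the 2.0 kPa threshold are all assumptions made for the example.

```python
from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class EngineReading:
    # Hypothetical per-engine air measurements; field names are illustrative.
    pressure_kpa: float
    temp_c: float

def assess_conditions(readings: list[EngineReading]) -> dict:
    """Fuse engine-sensor samples into a simple weather judgment.

    A real system would run far richer models on parallel processors;
    here, high pressure variance across samples stands in for turbulence.
    """
    pressures = [r.pressure_kpa for r in readings]
    temps = [r.temp_c for r in readings]
    return {
        "mean_pressure_kpa": mean(pressures),
        "mean_temp_c": mean(temps),
        # Arbitrary threshold chosen for the sketch only.
        "turbulence_suspected": pstdev(pressures) > 2.0,
    }

samples = [EngineReading(26.4, -54.0), EngineReading(26.6, -53.8),
           EngineReading(31.2, -53.9), EngineReading(26.5, -54.1)]
report = assess_conditions(samples)
```

The one outlier pressure sample is enough to trip the turbulence flag, which is the essential idea: the onboard processors turn raw sensor streams into a judgment the rest of the network can use.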

In this way, airlines might be able to bypass bad weather and turbulence, engine manufacturers might be able to fine-tune their products for optimum performance in a variety of conditions, and pilots might be able to choose the most fuel-efficient paths to their destinations.

All that is nice for air travel, but what about the notion of data-linking this information in real time to the National Weather Service, where other computers could gather weather data from thousands of aircraft operating in a wide variety of conditions and altitudes simultaneously?

Such a system could enable the Weather Service to reach out and physically touch the air almost anywhere in the United States at any given moment. That would make a deep well of data, and could vastly improve weather forecasting beyond what it is today.
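The aggregation side of that idea, collapsing position-tagged reports from thousands of aircraft into a coarse national snapshot, can be sketched as follows. The report fields and the one-degree grid are assumptions made for the example; a real feed would carry winds, pressure, altitude, and timestamps.

```python
from collections import defaultdict
from statistics import mean

def grid_cell(lat: float, lon: float, deg: float = 1.0) -> tuple[int, int]:
    """Bucket a position into a coarse lat/lon grid cell."""
    return (int(lat // deg), int(lon // deg))

def aggregate(reports: list[dict]) -> dict:
    """Collapse per-aircraft reports into a mean temperature per grid cell.

    Each report is a hypothetical dict with "lat", "lon", and "temp_c";
    this stands in for the ground-side computers gathering data from
    thousands of aircraft at once.
    """
    cells = defaultdict(list)
    for r in reports:
        cells[grid_cell(r["lat"], r["lon"])].append(r["temp_c"])
    return {cell: mean(vals) for cell, vals in cells.items()}

feed = [
    {"lat": 37.6, "lon": -99.9, "temp_c": -48.0},  # near Dodge City, Kan.
    {"lat": 37.2, "lon": -99.3, "temp_c": -46.0},  # same grid cell
    {"lat": 42.4, "lon": -71.5, "temp_c": -52.0},  # eastern Massachusetts
]
snapshot = aggregate(feed)
```

Two aircraft over southwestern Kansas land in the same cell and are averaged; the Massachusetts report fills a separate cell. Scale the feed to thousands of aircraft and the grid becomes the "deep well of data" described above.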

“We know there is a magical opportunity for parallel processing in the aerospace industry,” says Rork Brown, strategic engineer for the Pratt & Whitney Applied Technologies Group. “This would be a very big architecture, and a very large undertaking, but the stars are aligning on a number of levels.”

The U.S. Federal Aviation Administration is working on the Next-Generation Air Transportation System, better known as NGATS; communications networks are evolving quickly, driven by the Internet and software-defined radio. Meanwhile, the Cell Processor at Mercury is opening new opportunities for intensive data processing.

“The Cell Processor raised its head and allows multicore computing on one chip,” Brown says. “We at Pratt & Whitney had a communications architecture network, and we simply superimposed that chip on our architecture. Now we have something really tremendous that fits into this whole communications network architecture.”

This kind of capability will not happen overnight, Brown cautions. “I think we’ll see this kind of system perhaps within the next 10 years,” he says.

I can’t wait to see what Hazel thinks about all of this. I know she’ll wish this had been around on that warm September afternoon back in 1938.
