Beyond petaflops

Oct. 1, 2000

by John Rhea

SILVER SPRING, Md. — There are real-world needs for the inexorable onward march of computing power, and one of NASA's top computer scientists has a short-list of projects that should push the computer industry at least another half dozen orders of magnitude in performance.

Bill Feiereisen, chief of the Numerical Aerospace Simulation Systems Division at the space agency's Ames Research Center in Mountain View, Calif., singles out long-range weather forecasting as the most demanding task of all — and the one the industry could focus on to develop the technology base that the other demanding applications will need.

Right now, he says, the National Weather Service's computers work in the range of high megaflops to low gigaflops (millions and billions of floating point operations per second, respectively), and that's only good for 24 to 36 hours' worth of forecasts that forecasters can live with. Moreover, Feiereisen adds, the grid points bounding weather phenomena are 50 kilometers apart. He'd like eventually to get them down to 10 meters.

A teraflop is one trillion, or 10¹², floating point operations per second. At this level Feiereisen figures the forecasters could get a grip on the three-dimensional nature of weather, push down the resolution to about five kilometers, and make accurate forecasts out to three days.
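A rough sketch shows why shrinking the grid drives the computing requirement so hard. The fourth-power scaling below is an assumption for illustration only, not a figure from Feiereisen: refining a three-dimensional grid usually forces a proportionally shorter time step as well, so the work grows roughly as the fourth power of the refinement.

    # Rough cost factor for refining a 3-D forecast grid. The exponent of 4
    # (three spatial dimensions plus a shorter time step) is an illustrative
    # assumption, not a figure from the article.
    def cost_factor(dx_old_km, dx_new_km, exponent=4):
        return (dx_old_km / dx_new_km) ** exponent

    print(f"50 km -> 5 km grid:  ~{cost_factor(50, 5):,.0f}x more work")
    print(f"50 km -> 10 m grid:  ~{cost_factor(50, 0.01):.1e}x more work")

By that crude yardstick, five-kilometer grids demand something like four orders of magnitude more work than today's, and ten-meter grids far more still, which is why the wish list does not stop at teraflops.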

NASA experts are already creating petabyte databases at their Goddard Space Flight Center in Greenbelt, Md., to handle the overwhelming volume of Earth resource data pouring in from satellites. A petaflop — what would seem the next logical step — is one quadrillion, or 10¹⁵, floating point operations per second.

But Feiereisen's vision extends beyond that. Speaking at a seminar sponsored by Silicon Graphics Inc. of Mountain View, Calif., at the Silver Spring, Md., offices of its SGI Federal unit, he called for three more orders of magnitude in improvement over the next five to 10 years to what he calls "exaflops." This term refers to one quintillion, or 10¹⁸, floating point operations per second. This should get the weather forecasts out to 10 days, he says.

To put these mind-boggling numbers in perspective, consider this analogy: if you had a million dollars in thousand dollar bills, you'd have a stack about four inches high. If you had a billion dollars — again in thousand dollar bills — your stack would be 500 feet high, about as tall as the Washington Monument. And to take the analogy one step further, a trillion dollars worth of thousand dollar bills would be a hundred miles high.

Note the pattern here. Boosting computer power by six orders of magnitude, from megaflops to teraflops, essentially triples forecasting capability from a day to three days, and another six orders of magnitude from teraflops to exaflops triples it again to 10 days. Beyond that level — or perhaps before it — the industry may run into the brick wall of the laws of physics, but that shouldn't stifle anybody's imagination.
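Written out as a toy rule of thumb, purely an extrapolation of the figures above and not any kind of meteorological model, the pattern looks like this:

    import math

    # Toy extrapolation of the pattern above: every six orders of magnitude
    # of sustained compute roughly triples the usable forecast range.
    def forecast_days(flops, base_flops=1e6, base_days=1.0, gain=3.0):
        orders = math.log10(flops / base_flops)
        return base_days * gain ** (orders / 6.0)

    for name, f in [("megaflops", 1e6), ("teraflops", 1e12), ("exaflops", 1e18)]:
        print(f"{name:>10}: ~{forecast_days(f):.0f} days")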

In fact, extrapolating Moore's law has long been a favorite sport for spectators and participants. As currently understood, the law formulated by Intel Corp. co-founder Gordon Moore calls for doubling performance every 18 months with no increase in price, but I remember that in the early 1970s it was originally expressed as two orders of magnitude per decade. It works out about the same either way, and the current version is probably more comprehensible for people who are uncomfortable with scientific notation.
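A quick calculation shows why the two phrasings come out about the same:

    # Doubling every 18 months versus two orders of magnitude per decade.
    growth_per_decade = 2 ** (120 / 18)   # 120 months in a decade, 18-month doubling
    print(f"~{growth_per_decade:.0f}x per decade")   # ~102x, i.e. roughly 10^2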

What makes the weather forecasting problem so intriguing is that it could be the stalking horse for driving up computer power levels to tackle the other tough applications. Partisans of computer power once thought the Strategic Defense Initiative would accomplish this in order to perform the seemingly impossible task of discriminating incoming nuclear warheads from decoys, destroying the warheads, and then confirming that they had been destroyed. That possibility remains elusive.

Leaving aside the questions of whether adequate sensor data could be acquired or whether chaos theory rather than the laws of physics would be the ultimate brick wall, the quest for long-range weather forecasting would be a powerful stimulus even if its goals are never achieved.

Also on Feiereisen's list is computational fluid dynamics (CFD), long a specialty at the Ames center in its ongoing effort to serve as an electronic alternative to traditional wind tunnel testing of new aircraft designs. The value of CFD to determine complex boundary layer conditions of high-performance aerospace vehicles is obvious, and Ames has a steady business for military aircraft of all kinds as well as NASA's own designs for post-Space-Shuttle launch vehicles.

Feiereisen suggests that CFD concepts could extend into other areas, such as simulating biological processes (pumps for artificial hearts, for example) and nanotechnology, in which devices could suck cholesterol molecules out of the bloodstream.

Or how about air traffic control? The growing volumes of air traffic can be extrapolated to reach crisis levels in the near future. Will Rogers once said that land was the best investment because "they ain't making any more of that." They ain't making any more airspace either.

The Federal Aviation Administration is mightily interested in finding computer solutions in its attempt to "deconflict" airspace.

Speaking at the same conference, Henry Dardy, chief scientist for advanced computing at the Naval Research Laboratory (NRL) in Washington, disclosed some comparable extrapolations of how more computer power can meet some of the Navy's needs.

Today, Dardy says, operational personnel have about a gigaflop available at their desktop workstations to produce what he calls one "product" an hour. A product, by his definition, is a task such as analyzing an image to determine if a pilot took out a target.

What he wants in the near term is 10 to 100 gigaflops to produce several products an hour. Then he wants to move up to petaflops to produce the products dynamically in seconds.
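A back-of-the-envelope check suggests Dardy's targets hang together. The cost per product below is inferred from his one-product-per-hour figure, not something he stated, and it ignores I/O and everything else that isn't arithmetic:

    # If one "product" keeps a 1-gigaflop workstation busy for an hour, it
    # costs roughly 3.6e12 operations (an inference from Dardy's figures).
    flop_per_product = 1e9 * 3600

    for name, flops in [("1 GFLOPS", 1e9), ("100 GFLOPS", 1e11), ("1 PFLOPS", 1e15)]:
        print(f"{name:>10}: ~{flop_per_product / flops:,.3f} s per product")

Even allowing a few orders of magnitude for data movement and other overhead, petaflops turns an hour-long turnaround into seconds.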

NRL's role in this process is to serve as the point of technology incubation and insertion in support of the Navy's mission. Dardy is aiming at an information infrastructure that will, in his words, "look local, act global, and stay one step ahead of the bad guys." (The bad guys nowadays are defined as hackers, not Russians.)

What government customers like Feiereisen and Dardy are trying to tell industry is that computer power is not a solution looking for a problem. There is no lack of computationally intensive problems, and unless Moore's law is repealed, the customers should get their solutions eventually. They'd prefer sooner rather than later.
