Leap to the extreme scale could break science boundaries
The fastest computer in the United States as of November 2010 is a Cray XT system at ORNL’s National Center for Computational Sciences, named Jaguar. With more than 224,000 processor cores, it has a theoretical peak speed of 2.3 petaflops and has sustained more than 1 petaflops on some applications, Bland says.
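Where does a number like 2.3 petaflops come from? A machine’s theoretical peak is simply the product of its core count, clock rate, and floating-point operations per cycle. A minimal sketch of the arithmetic, where the per-core clock rate and operations-per-cycle figures are assumptions for illustration, not published Jaguar specifications:

```python
# Back-of-the-envelope peak-performance arithmetic. The per-core numbers
# below are illustrative assumptions, not official Jaguar specifications.
cores = 224_256            # "more than 224,000 processor cores"
clock_hz = 2.6e9           # assumed clock rate per core
flops_per_cycle = 4        # assumed floating-point operations per cycle

peak_flops = cores * clock_hz * flops_per_cycle
print(f"Theoretical peak: {peak_flops / 1e15:.2f} petaflops")
# -> about 2.33 petaflops, consistent with the quoted 2.3-petaflops figure
```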
Jaguar enables scientists like Jacqueline Chen of Sandia National Laboratories to study fundamental processes in incredible detail. “Chen has an extensive library of simulation data for combustion modeling,” Bland says. “It can be used to make natural gas-burning generators or diesel engines run more efficiently.”
In another case, Jaguar is helping scientists study superconducting materials that could deliver electricity more efficiently. Today’s transmission lines waste about a quarter of the electricity generated in the United States as heat created by electrical resistance. Superconductors lose almost no energy to heat, but so far only materials chilled to extremely low temperatures – about -200 °C – exhibit the property.
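The physics behind that waste is straightforward: a line with resistance R carrying current I dissipates power I²R as heat, while a superconductor’s resistance, and hence that loss term, is essentially zero. A minimal sketch, with made-up line parameters chosen only to illustrate the effect, not data for any real grid:

```python
# I^2 * R transmission-loss sketch. All line parameters are illustrative
# assumptions, not measurements of any real power line.
delivered_power_w = 500e6       # 500 MW delivered (assumed)
line_voltage_v = 345e3          # a typical high-voltage line (assumed)
line_resistance_ohm = 50.0      # assumed total line resistance

current_a = delivered_power_w / line_voltage_v
loss_w = current_a**2 * line_resistance_ohm
print(f"Resistive loss: {loss_w / 1e6:.1f} MW "
      f"({100 * loss_w / delivered_power_w:.1f}% of delivered power)")
# A superconducting line has near-zero resistance, so this loss vanishes.
```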
Thomas Schulthess of the Swiss National Supercomputing Centre and the Institute for Theoretical Physics in Zurich and his colleagues are using Jaguar to seek new materials that become superconducting at room temperature.
But very fast computers already make daily differences in everyone’s life. For example, they help meteorologists forecast the weather.
“Think back to when you were a kid,” Bland says, “and the weather reports for ‘today’ were pretty good, but maybe not perfect, and ‘tomorrow’ was just a tossup. In a seven-day forecast today, the sixth and seventh days may be a little iffy, but you can usually really count on forecasts over the next three or four days.”
An even more accurate weather forecast might not matter much if it only means knowing which weekend day will be better for boating or golf. But an accurate forecast can mean the difference between life and death if it tells you when and how hard a hurricane will hit your seaside town.
With more powerful computers, scientists could make predictions about weather even further into the future.
“One of the interesting things we want to do is global climate modeling – modeling the entire planet, but with resolution on a regional basis,” Seager says. Such a regional model might be able to predict changes in the cycle that generates winter snows that melt to fill California’s reservoirs. “If the climate heats up to where precipitation comes down as rain instead of snow in the Sierras,” Seager says, “then our water planning and infrastructure would need to radically change.”
But with current computing capabilities, scientists aren’t yet close to developing such models, Seager says. That would require an increase in resolution of 10 to 100 times over current models, and Seager would like to see a global climate model with resolution down to a kilometer.
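The computational cost of that resolution jump is easy to underestimate. Refining the horizontal grid by a factor f multiplies the number of grid columns by roughly f², and numerical stability (the CFL condition) shrinks the allowable time step by about a factor of f as well, so total cost grows roughly like f³. A hedged sketch of that scaling, using the standard textbook approximation rather than figures from Seager:

```python
# Rough cost scaling for refining a climate model's horizontal grid.
# Assumes cost ~ (grid columns) * (time steps): refining resolution by a
# factor f gives ~f^2 more columns and ~f more time steps (CFL condition),
# so cost grows roughly as f^3. Illustrative only.
def relative_cost(refinement_factor: float) -> float:
    horizontal_points = refinement_factor**2   # f^2 more grid columns
    time_steps = refinement_factor             # ~f more steps (smaller dt)
    return horizontal_points * time_steps

for f in (10, 100):
    print(f"{f}x finer grid -> roughly {relative_cost(f):,.0f}x the compute")
# 10x -> ~1,000x; 100x -> ~1,000,000x: petascale to exascale territory.
```

Under this rough scaling, a tenfold resolution increase alone demands about a thousand times Jaguar’s sustained petaflops, which is exactly the step from petascale to exascale.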
An exascale machine will provide that resolution – and more. It also will enable improved physics. “The current climate models don’t consider much mixing of layers in the atmosphere,” Seager says. “We want a model that allows air from the surface to mix with air higher up, and vice versa.” An exascale model also will include more predictive modeling of clouds and cloud cover, and thereby precipitation.
An exascale machine also could be turned loose on problems in biofuels. Instead of being made from corn or soybeans, ethanol might be made from plant waste or from more common plants, like the fast-growing kudzu that plagues much of the Southeast. Cows’ digestive systems use enzymes to break down the cellulose in such plants and extract energy. An exascale computer might find ways to engineer an enzyme that works fast enough for industrial-scale production of biofuels from waste.
When computer scientists learn how to push computing to a scale that starts to match the stars, such in silico simulations will change more than we can even imagine.