to fuel fusion energy
Posted May 26, 2011
Part of a series.
If clean, abundant energy from nuclear fusion – the process that generates the sun’s power – becomes a reality, powerful supercomputers will have helped get us there.
Detailed computational models are key to understanding and overcoming the complex technical challenges of developing this carbon-free energy source, which produces no significant toxic waste and burns hydrogen, the most abundant element in the universe.
Accurate, predictive simulations pose computational challenges that must be overcome to glean important information about the fundamental reactions that will power ITER, the multibillion-dollar international effort to build a fusion experiment capable of producing significantly more energy than it consumes.
The models that will point the way to commercially viable fusion energy will depend on exascale computing and other new technologies developed along the way. Access to exascale computers has the potential to accelerate progress toward a clean-energy future, says William Tang, director of the Fusion Simulation Program at the Princeton Plasma Physics Laboratory at Princeton University.
“We have good ideas how to move forward,” Tang says, “but to enhance the physics fidelity of the simulation models and incorporate the added complexity that needs to be taken into account will demand more and more powerful computing platforms and more systematic experiments to validate models.”
Scientists now focus simulations on single aspects of plasma physics. A realistic model must integrate multiple physical simulations, and that will require the next generation of computing power. Next-generation computers, expected to come online within the decade, will be capable of an exaflops – 1 million trillion (10^18) calculations per second – about 1,000 times more powerful than today’s fastest machines.
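The scale of that jump is easy to miss in prose. A rough back-of-the-envelope sketch (the operation counts here are illustrative, not figures from the article) shows what a factor of 1,000 means for turnaround time on a large simulation:

```python
# Illustrative scale comparison: petaflops vs. exaflops machines.
# All workload numbers below are hypothetical, chosen only to show the ratio.

PETA = 10**15  # petaflops: roughly the class of 2011's fastest machines
EXA = 10**18   # exaflops: one million trillion calculations per second

# An exaflops machine is 1,000 times faster than a 1-petaflops machine.
speedup = EXA // PETA
print(speedup)  # 1000

# A hypothetical simulation requiring 10**21 floating-point operations:
ops = 10**21
hours_on_peta = ops / PETA / 3600   # ~278 hours (about 11.5 days)
minutes_on_exa = ops / EXA / 60     # ~17 minutes
print(round(hours_on_peta), round(minutes_on_exa))
```

A run that ties up a petaflops machine for days completes in minutes at exascale, which is what makes coupling several physics models into one integrated simulation plausible.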