November 2012

Accelerating a cosmic code

Even as it runs at multi-petaflop speeds, researchers are out to make HACC, the Hardware/Hybrid Accelerated Cosmology Code framework, run even better as part of a collaboration involving six Department of Energy laboratories.

They recently began a three-year project, “Computation-Driven Discovery for the Dark Universe,” with support from the DOE Office of Science’s Scientific Discovery through Advanced Computing (SciDAC) program. With researchers from Brookhaven, Lawrence Berkeley and Los Alamos national laboratories and from the Fermi and SLAC national accelerator laboratories, Argonne National Laboratory scientists will work on further developing HACC and other cosmological codes. They’ll focus on codes that incorporate multiple physical processes and span a range of length scales. The group also will research codes that both track particles and use computational grids, and that can run on diverse computer architectures.

“There’s a lot of focus on improving HACC even further so it runs faster,” says Katrin Heitmann, who leads the project at Argonne. The project expects to use supercomputers at the Argonne and Oak Ridge leadership computing facilities and at the National Energy Research Scientific Computing Center at Lawrence Berkeley.

With improved models and new sky surveys coming online, the scientists hope their codes will help cosmologists gain new insights into the nature of dark energy and dark matter.