Trillion-particle simulations take on the biggest big-data problems.
A University of Utah team is turning to high-performance computing to design a clean oxy-coal boiler.
PETSc, a library of ready-to-use high-performance computing software, is gaining renown for its simulation-tackling tools.
Data efficiency blooms with the ROSE Framework.
This mid-1990s research, winner of the SC14 Test of Time Award, made big computational models amenable to parallel processing.
A Colorado School of Mines professor’s mathematical methods probe inputs to cut problem size and ease computation.
Researchers seek new ways to help supercomputers bounce back.
An Oak Ridge National Laboratory team builds a new way to follow the bouncing neutrons.
Energy-materials research provides a test case for managing the big-data flood.