We the AI trainers
A Gordon Bell-nominated team aims to put the power behind large language models into the hands of ordinary people.
With Argonne’s Polaris supercomputer, a University of Michigan-led team bets on large language models to improve batteries.
A PNNL physicist turns from basic to applied research: biopreparedness.
A University of Kansas aerospace engineer aims high-performance computing at intricate airflow patterns to improve aircraft flight.
PPPL looks to magnetic mirrors to produce small, economical fusion devices.
A Northwestern University DOE Early Career Award recipient speeds alternative-energy discoveries.
Weill Cornell team uses powerful computing to decipher the molecular mechanics of healthy and diseased cells.
A University of Maryland DOE Early Career researcher explores flexible, dynamic architectures for high-performance computers.
A USC computer scientist wants to produce quantum materials at scale, with help from Argonne supercomputers.
Computation reveals details about how matter falling into massive cosmic objects and the energy released from them affect the universe’s evolution.