Climate
February 2015

Weather extremists

Pacific Northwest National Laboratory modelers turn to extreme-scale computing to simulate climate and weather extremes like the ‘Pineapple Express’ that drenched the West Coast this winter.

The “Pineapple Express” is an atmospheric river, seen here in an integrated water vapor (IWV) model branching off an equatorial storm and reaching from Hawaii to the U.S. West Coast. Atmospheric rivers like this one can carry up to 20 times the flow of the Mississippi River. Courtesy of the Department of Energy’s Atmospheric Radiation Measurement program.

Throughout the winter, flooding has punctuated California’s extreme and lengthy drought. Rains have slammed the Bay Area particularly hard. Greenhouse gases and aerosols – minuscule airborne particles, such as pollution, dust and salt – may be exacerbating these weather extremes.

To understand what leads to such deluges and to improve precision in modeling extreme weather and climate processes, researchers at Pacific Northwest National Laboratory (PNNL) have been running supercomputer simulations under the U.S. Department of Energy (DOE) ASCR Leadership Computing Challenge (ALCC).

PNNL Laboratory Fellow Ruby Leung and colleagues have simulated heavy precipitation resembling that of the December deluge. Philip Rasch, PNNL’s chief scientist for climate science, modeled the effects of the water cycle, the aerosol and greenhouse gas processes, and emission sources.

Leung’s work focuses on the water cycle, particularly events such as atmospheric rivers – narrow filaments of concentrated water vapor, visible in satellite images, that move eastward from the tropical Pacific and pump moisture into the far western United States during winter. One famous atmospheric river is the “Pineapple Express,” thought to contribute greatly to extreme rainfall in the Pacific Northwest. The floods this winter happened when atmospheric rivers made landfall.

But these giant systems are just part of the story. Like politics, all precipitation is local. Rain can fall at one locale while it’s dry just 10 kilometers (km) away. Yet climate models represent precipitation as a kind of frequent Seattle-like drizzle. Real rain (or snow) may be irregular and fall in anything from a downpour to a mist.

Most clouds form by convection during summer, when warm surface air rises rapidly in small, concentrated areas and its water vapor cools and condenses. The variability of topography and weather in a given place accounts for convection’s local flavor, Leung says. “When you have a hotter temperature and unstable atmosphere, for example, you have more convection going on, more vigorous cloud building and maybe stronger precipitation. One of the major weaknesses of climate models is that we don’t have a good way of representing this convection process because resolution is not high enough.”

Resolution is a measure of the simulation’s precision. It’s like taking photos with a digital camera: The more pixels, the higher the resolution. Similarly, climate models can achieve higher resolution by breaking the atmosphere into smaller and smaller pieces, like pixels, that when taken together provide a complete picture.
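Sticking with the camera analogy, a back-of-envelope Python sketch (my own illustration, not the labs’ code; it assumes idealized square cells tiling a spherical Earth) shows how quickly the “pixel” count grows as the grid shrinks:

```python
import math

# Earth's surface area, assuming a sphere of radius 6371 km
EARTH_SURFACE_KM2 = 4 * math.pi * 6371**2  # ~5.1e8 km^2

def n_cells(dx_km):
    """Approximate number of horizontal grid cells needed to tile
    the globe with idealized square cells dx_km on a side."""
    return EARTH_SURFACE_KM2 / dx_km**2

for dx in (150, 25, 4):
    print(f"{dx:>3}-km grid: ~{n_cells(dx):,.0f} cells")
```

At 150 km the globe needs only about 23,000 columns; at the 4-km spacing Leung used regionally, covering the whole globe the same way would take roughly 32 million, before counting vertical levels.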

At each point where the modeled space is divided horizontally and vertically, computers solve equations that represent wind dynamics, energy changes, terrestrial ecosystem processes, ocean and sea ice dynamics, cloud formation and other physical processes.

But high resolution has a cost. “If you break up the globe into grids of smaller and smaller sizes, your model must do more computing and it needs much more computer power to solve the differential equations,” Leung says. “We have to cover the whole globe. Imagine a horizontal grid with vertical depth, where each cell covers an area of about 150 km by 150 km. Representing whatever happens inside this area by the average conditions solved by differential equations does not adequately describe processes that happen inside the grid.”
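Leung’s point about computing cost can be made concrete with a rough scaling rule (a hypothetical illustration, not PNNL’s actual cost model): refining the horizontal spacing by a factor r multiplies the number of cells by r², and explicit numerical schemes must also shrink the timestep by about a factor of r to stay stable, so total cost grows roughly as r³.

```python
def relative_cost(dx_coarse_km, dx_fine_km):
    """Rough relative compute cost of refining a global model's
    horizontal grid spacing by r = coarse/fine. Cells scale as r^2,
    and the stability (CFL) limit shrinks the timestep by ~r,
    giving ~r^3 overall. Vertical levels and the physics packages
    add further cost on top of this."""
    r = dx_coarse_km / dx_fine_km
    return r**3

# Refining from a 150-km grid to a 4-km grid:
print(f"~{relative_cost(150, 4):,.0f}x the compute")
```

By this crude estimate, dropping from 150 km to 4 km inflates the cost by a factor of tens of thousands, which is why such runs land on leadership-class machines like Titan.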

Leung and colleagues ran their high-resolution simulations last year with the global Community Atmosphere Model 5 (CAM5) and the Weather Research and Forecasting (WRF) regional model. They burned through 18 million processor hours, most of them on the Oak Ridge Leadership Computing Facility’s Titan supercomputer, a Cray XK7. She used WRF to model the atmosphere at 4-km resolution and CAM5 at variable resolution globally, enabling higher-resolution modeling of specific regions.

By dropping to 4-km resolution, Leung captured peak precipitation, which in the central United States often occurs at night or in early morning, unlike other regions where it often peaks in the afternoon. She says the model correctly reflects convection and thus the regional differences in peak precipitation during the day and its spread from the Rocky Mountains to the central Great Plains. “In the past, if you ran the model with 150-km resolution, the timing of precipitation would peak at about 2 p.m., which is totally incorrect.”

To figure out what’s happening, “we can’t just look at features in the mountainous far West. We have to simulate what happens in the tropical areas where the transport of moisture begins.”

Toward that end, Leung and collaborators from DOE, NASA, the National Oceanic and Atmospheric Administration, and the National Science Foundation are gathering data in the Pacific and California during January and March 2015 as part of a project called CalWater. “We will run some global simulations at higher resolution to compare our climate model with data we are collecting in this winter campaign.”

Rasch used the Community Earth System Model 1 (CESM1) for his 2013 simulations, also at Oak Ridge, consuming 36 million processor hours. PNNL scientist Jin-Ho Yoon, who worked with Rasch on the project, says the results suggest that, consistent with previous findings, human activities play a significant role in driving extreme weather. The PNNL team’s results will be published in papers now in preparation.

Yoon says the team tuned CESM1 to reflect greenhouse gas and aerosol conditions and to explore the changes such forcing creates in the model. “In the future, we want to look more closely at climate extremes using an amped-up next-generation model being developed by DOE under the Accelerated Climate Modeling for Energy (ACME) project.”

The next-generation climate model, expected by the end of 2016, is based on CESM1 but will resolve at 25 km, Yoon says, compared with the current 100 km. It will boost efficiency and add rigor to the physics being modeled but also will require greater computing resources than the current model.