In this project, Dr. Alex Hall and his team at the UCLA Center for Climate Science undertook a comprehensive investigation of future climate in California’s Sierra Nevada. The team used an innovative method they developed to produce fine-scale climate change projections. Unlike past studies, these projections take into account key physical processes that affect the rate of snow loss under warming. Using these projections, the team is answering key questions about the fate of the Sierra Nevada snowpack, a critical natural resource that not only supports an iconic ecosystem but also provides freshwater to millions of Californians.
Sierra Nevada snowpack is an important water source for California. It acts as a natural reservoir that holds water in frozen form until it gradually melts over spring and summer, flowing into man-made reservoirs and conveyance systems. Past studies have shown that in the future, global warming is expected to shrink the Sierra snowpack. If California is to adapt to these changes, we need to know the specifics: How much warmer will it get? Just how much snow will we lose overall? How much earlier will snow melt and run off? Will soils dry out faster after snow has melted? Will all elevations and all watersheds be affected to the same degree? If we act to reduce greenhouse gas emissions, can we prevent these changes?
The scientific challenge
Answering these questions requires the help of global climate models, powerful computing tools that simulate the climate system. Global climate models are the best tools we have for projecting future climate. But they are too low in spatial resolution to accurately simulate future climate in topographically complex areas like the Sierra Nevada, where different elevations experience different climatic conditions. There are several methods by which climate scientists can “downscale” global climate model information to create higher-resolution future climate projections. Some of these methods are dynamical, meaning they use a regional climate model, a high-resolution cousin of a global climate model, to simulate future climate. Dynamical downscaling is physically realistic but is very expensive, computationally speaking. Other downscaling methods are statistical, using mathematical shortcuts to produce higher-resolution projections. Statistical models are computationally cheap and quick to run, but they don’t necessarily represent the physical dynamics of local climate.
A physical climate phenomenon that’s especially important to represent when studying snow is called snow albedo feedback. Albedo is a measure of how much sunlight is reflected by a surface. Snow has a high albedo, meaning it reflects a lot more sunlight than it absorbs. Other land surfaces have lower albedos, meaning they absorb more sunlight than snow. Snow albedo feedback occurs when warming causes snowpack to shrink at its margins. The ground that is uncovered absorbs more sunlight than snow would have, which enhances the warming at that location. The enhanced warming melts more snow, which exposes more sunlight-absorbing ground, which further enhances warming, and so on. In other words, snow albedo feedback is a feedback loop that leads to greater local warming than would be expected from atmospheric warming alone. If a study doesn’t account for it, snow loss could be significantly underestimated.
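The amplifying loop described above can be illustrated with a small numerical sketch. Everything here is hypothetical — the snow-loss rate per degree and the feedback strength are toy numbers chosen only to show how repeated rounds of "warming uncovers ground, uncovered ground adds warming" settle at a total warming larger than the atmospheric warming alone:

```python
# Toy illustration of snow albedo feedback. All numbers are hypothetical,
# chosen only to demonstrate the shape of the feedback loop, not to match
# the study's actual projections.

def snow_fraction(temp_anomaly_f):
    """Assumed toy relationship: each degree F of warming uncovers 5% of the snow area."""
    return max(0.0, 1.0 - 0.05 * temp_anomaly_f)

def equilibrium_warming(baseline_warming_f, feedback_strength_f=2.0,
                        tol=1e-6, max_iter=100):
    """Iterate the loop: warming -> less snow -> darker surface -> extra warming.

    feedback_strength_f is the (assumed) extra warming, in deg F, produced
    if the entire snow-covered area were replaced by bare ground.
    """
    warming = baseline_warming_f
    for _ in range(max_iter):
        exposed = 1.0 - snow_fraction(warming)  # fraction of ground now uncovered
        new_warming = baseline_warming_f + feedback_strength_f * exposed
        if abs(new_warming - warming) < tol:
            break
        warming = new_warming
    return warming

# 5 degrees F of atmospheric warming grows to about 5.56 F once the
# feedback loop is allowed to run to equilibrium.
print(round(equilibrium_warming(5.0), 2))
```

The key point the sketch makes is the last one in the paragraph: a model that stops after the first step (the 5.0 F of atmospheric warming) underestimates the total local warming, and therefore the snow loss.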
Our team has developed an innovative approach called hybrid downscaling, which combines the strengths of dynamical and statistical downscaling. First, our researchers run a limited number of dynamical model simulations that represent the full span of global climate model outcomes. Then, they build a statistical model that mimics the dynamical one. The result is a set of future climate projections that is both comprehensive and physically credible. The team first used this approach in a first-of-its-kind study of climate change in the greater Los Angeles region. Now they are applying it to project future climate in the Sierra Nevada.
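The two-step logic of hybrid downscaling — train a cheap statistical model on a few expensive dynamical runs, then apply it to the whole ensemble — can be sketched in miniature. This is not the team's actual model; the numbers are synthetic, and the emulator here is deliberately the simplest possible one (a least-squares line), standing in for whatever statistical form the real method uses:

```python
# Minimal sketch of the hybrid-downscaling idea, with synthetic numbers.
import numpy as np

# Step 1 (expensive): pretend these three pairs came from dynamical
# regional-model simulations spanning the range of GCM outcomes.
# Each pair maps coarse GCM warming (deg F) to fine-scale warming
# at one high-elevation grid cell (deg F), amplified by local feedbacks.
gcm_warming_dyn = np.array([3.0, 6.0, 9.0])
local_warming_dyn = np.array([3.8, 7.4, 11.0])

# Step 2 (cheap): fit a statistical emulator to the dynamical runs.
slope, intercept = np.polyfit(gcm_warming_dyn, local_warming_dyn, deg=1)

# Step 3: apply the emulator to the full (synthetic) GCM ensemble,
# producing fine-scale projections without more regional-model runs.
full_ensemble = np.array([3.5, 4.2, 5.0, 6.8, 7.5, 8.9])
downscaled = slope * full_ensemble + intercept
print(downscaled.round(2))
```

The design point is the trade-off named in the text: the dynamical runs supply physical credibility, and the statistical step supplies coverage of the full ensemble at negligible computational cost.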
In this project, our team first created a simulation of the historical climate from 1981 to 2000 at a spatial resolution of 3 kilometers, or about 1.9 miles. Then they projected future climate for 2041–2060 and 2081–2100 at the same resolution. The future projections represent the full ensemble of latest-generation global climate model output, which underlies the most recent climate change assessment report by the Intergovernmental Panel on Climate Change (IPCC). The team created projections for different greenhouse gas scenarios used in the latest IPCC assessment. These include a scenario that roughly represents the current emissions-reduction goals of the recent Paris climate accord, which we call “mitigation,” and a scenario that represents “business as usual.”
To date our team has analyzed future changes to temperature and snow cover, as well as impacts on snow during drought.
Key findings on temperature and snow cover:
- By 2081–2100, if nothing is done to curb greenhouse gas emissions, average temperatures in the Sierra Nevada are projected to increase by about 7–10 degrees F, depending on the month in question, compared with 1981–2000.
- Warming will be associated with decreases in snow cover in the Sierra Nevada by 2081–2100. In a typical April, the land area covered by snow shrinks by 48% compared with a typical April in 1981–2000.
- Warming and snow cover loss are greatest under a “business as usual” scenario of greenhouse gas concentrations. Under a “mitigation” scenario, warming and snow cover loss still occur but are less severe.
- A key factor in the severity of warming and snow cover loss is snow albedo feedback, a feedback loop in which warming causes snow cover loss, and snow cover loss increases warming. Ours is the most comprehensive study of future climate in the Sierra Nevada to date because it takes into account this feedback loop and considers multiple greenhouse gas scenarios.
Key findings on snow during drought:
- In the “snow years” (November–June) of 2011–2012 through 2014–2015, human-caused warming to date reduced average Sierra Nevada snowpack levels by 25%, compared with a climate model simulation without human-caused warming.
- Middle and low elevations (up to about 8,000 feet) saw even greater reductions, ranging from 26% to 43%.
- In a model simulation of the recent drought under the warming conditions expected at 2081–2100 under a “business as usual” scenario of greenhouse gas emissions, average snowpack is reduced by 85%, compared with what actually occurred in winters of 2011–2012 through 2014–2015. Nearly all snow is lost at elevations below 8,000 feet.
- Loss of snow in drought years will be made worse by climate change no matter which greenhouse gas emissions pathway the world follows.
Studies on snowpack, the timing of runoff from precipitation and snowmelt, and soil moisture are forthcoming. They will provide insight into climate change impacts not only on water resources but also on wildfire, ecosystems, and recreation.
In addition to publishing our findings in science journals, in 2017 we will publish a comprehensive report on the project. This report will synthesize our findings and explain their implications for policymakers and the public. Throughout 2017, Dr. Hall will also give a series of public talks about the project in the Lake Tahoe area, Sacramento, San Francisco, and other locations. For more information, stay tuned to our Events page or email Katharine Reich.
More about the project
Primary funding for this work was provided by the Metabolic Studio in connection with the Annenberg Foundation. Additional support was provided by the National Science Foundation, the US Department of Energy, the Luskin Center for Innovation, and the Sustainable LA Grand Challenge.