- Title
Using Machine Learning to Cut the Cost of Dynamical Downscaling.
- Authors
Hobeichi, Sanaa; Nishant, Nidhi; Shao, Yawen; Abramowitz, Gab; Pitman, Andy; Sherwood, Steve; Bishop, Craig; Green, Samuel
- Abstract
Global climate models (GCMs) are commonly downscaled to understand future local climate change. The high computational cost of regional climate models (RCMs) limits how many GCMs can be dynamically downscaled, restricting uncertainty assessment. While statistical downscaling is cheaper, its validity in a changing climate is unclear. We combine these approaches to build an emulator that leverages the merits of both dynamical and statistical downscaling. A machine learning model is developed for each coarse grid cell to predict fine grid variables, using coarse-scale climate predictors together with fine grid land characteristics. Two RCM emulators, one Multilayer Perceptron (MLP) and one Multiple Linear Regression error-reduced with Random Forest (MLR-RF), are developed to downscale daily evapotranspiration from 12.5 km (coarse scale) to 1.5 km (fine scale). In out-of-sample tests, the MLP and MLR-RF achieve Kling-Gupta Efficiency scores of 0.86 and 0.83, correlations of 0.89 and 0.86, and coefficients of determination (R²) of 0.78 and 0.75, with relative biases of −6% to 5% and −5% to 4%, respectively. Using histogram matching as a measure of spatial efficiency, both emulators achieve a median score of ∼0.77. This is generally better than a common statistical downscaling method across a range of metrics. Additionally, through "spatial transitivity," we can downscale GCMs for new regions at negligible cost with only minor performance loss. The framework offers a cheap and quick way to downscale large ensembles of GCMs, enabling high-resolution climate projections from a larger number of global models, uncertainty quantification, and better support for resilience and adaptation planning.

Plain Language Summary: Climate models are used to predict what will happen as anthropogenic emissions of greenhouse gases change our climate over the next century. The information they produce is typically relevant at continental scales, but not at the local scale where we are most likely to experience the impact of a changing climate. To better understand how future climate will affect particular locations, information from global-scale climate models needs to be "downscaled," that is, converted to a spatial scale that is relevant for that location. This process uses an immense amount of computing power and time, which limits how many different climate models can be downscaled and, in turn, our ability to understand how uncertain particular projected changes are. In this paper, we combine this traditional downscaling approach with machine learning so that a collection of statistical models can emulate the downscaling. This technique markedly reduces the computing power required to downscale a global climate model, allowing us to downscale more global models with existing resources. In turn, this gives us a better understanding of the uncertainty associated with particular changes, ultimately providing better decision-making support for resilience and adaptation planning.

Key Points:
- We successfully implement a hybrid dynamical-machine learning downscaling approach
- This markedly cuts downscaling computational costs, allowing larger downscaled ensembles of century-scale projections with existing resources
- The approach can downscale new regions without dynamical downscaling at negligible cost through spatial transitivity
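The abstract's MLR-RF emulator pairs a multiple linear regression with a random-forest model that reduces the regression's error. One common way to realize that pairing is to fit the forest to the linear model's residuals; the sketch below shows this idea using scikit-learn. The class name `MLRRFEmulator` and the toy data are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

class MLRRFEmulator:
    """Multiple linear regression with a random-forest
    correction fitted to the linear model's residuals."""

    def __init__(self, **rf_kwargs):
        self.mlr = LinearRegression()
        self.rf = RandomForestRegressor(**rf_kwargs)

    def fit(self, X, y):
        # Fit the linear model first, then train the forest
        # on what the linear model failed to explain.
        self.mlr.fit(X, y)
        residuals = y - self.mlr.predict(X)
        self.rf.fit(X, residuals)
        return self

    def predict(self, X):
        # Final prediction = linear trend + nonlinear correction.
        return self.mlr.predict(X) + self.rf.predict(X)

# Toy example: four coarse-scale predictors, one fine-scale target
# with a mild nonlinearity that the forest can pick up.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + np.sin(X[:, 0])
model = MLRRFEmulator(n_estimators=50, random_state=0).fit(X, y)
preds = model.predict(X)
```

In the paper's setup one such model would be trained per coarse grid cell; the sketch above covers only the residual-correction mechanism itself.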
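The skill scores quoted in the abstract, Kling-Gupta Efficiency (KGE) and relative bias, follow standard definitions: KGE = 1 − √((r−1)² + (α−1)² + (β−1)²), where r is the Pearson correlation, α the ratio of standard deviations, and β the ratio of means between emulated and reference fields. A minimal sketch of both metrics (function names are illustrative, not from the paper's code):

```python
import numpy as np

def kling_gupta_efficiency(sim, obs):
    """KGE = 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2):
    r is the Pearson correlation, alpha the ratio of standard
    deviations, beta the ratio of means. 1 is a perfect score."""
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = np.std(sim) / np.std(obs)
    beta = np.mean(sim) / np.mean(obs)
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def relative_bias(sim, obs):
    """Mean error as a fraction of the observed mean."""
    return (np.mean(sim) - np.mean(obs)) / np.mean(obs)

# A perfect emulator scores KGE = 1 and zero relative bias.
obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(kling_gupta_efficiency(obs, obs))  # -> 1.0
print(relative_bias(obs, obs))           # -> 0.0
```

The paper reports KGE scores of 0.86 (MLP) and 0.83 (MLR-RF), i.e. both emulators land close to the perfect score of 1.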
- Subjects
DOWNSCALING (Climatology); CLIMATE change models; MACHINE learning; ATMOSPHERIC models; COST control
- Publication
Earth's Future, 2023, Vol 11, Issue 3, p1
- ISSN
2328-4277
- Publication type
Article
- DOI
10.1029/2022EF003291