Downscaling


Downscaling is any procedure for inferring high-resolution information from low-resolution variables. It is based on dynamical or statistical approaches and is commonly used in several disciplines, especially meteorology, climatology and remote sensing. [1] [2] The term usually refers to an increase in spatial resolution, but it is often also applied to temporal resolution. [3] [4] It should not be confused with image downscaling, which is the process of reducing an image from a higher resolution to a lower resolution.


Meteorology and climatology

Global Climate Models (GCMs) used for climate studies and climate projections are typically run at spatial resolutions of the order of 150 to 200 km [5] and are limited in their ability to resolve important sub-grid-scale features such as convective clouds and topography. As a result, GCM-based projections may not be robust for local impact studies.

To overcome this problem, downscaling methods have been developed to obtain local-scale weather and climate, particularly at the surface level, from the regional-scale atmospheric variables provided by GCMs. Two main forms of downscaling exist. One is dynamical downscaling, in which output from the GCM is used to drive a regional numerical model at higher spatial resolution, which is therefore able to simulate local conditions in greater detail. The other is statistical downscaling, in which a statistical relationship is established from observations between large-scale variables, such as atmospheric surface pressure, and a local variable, such as the wind speed at a particular site. The relationship is then applied to GCM output to estimate the local variable.
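
A minimal sketch of the statistical approach, assuming a simple linear transfer function and synthetic data (the choice of predictor, predictand and the scikit-learn library are illustrative assumptions, not part of any cited method):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: a coarse-scale predictor at the GCM grid cell
# (e.g. sea-level pressure) and a co-located local observation
# (e.g. station wind speed). Both series are synthetic here.
rng = np.random.default_rng(0)
pressure_large_scale = 1000.0 + 10.0 * rng.standard_normal(1000)                  # hPa
wind_local = 5.0 - 0.2 * (pressure_large_scale - 1000.0) + rng.standard_normal(1000)  # m/s

# 1. Calibrate the transfer function on the historical (observed) period.
model = LinearRegression()
model.fit(pressure_large_scale.reshape(-1, 1), wind_local)

# 2. Apply it to GCM output for another period to infer the local variable.
pressure_gcm_future = 1002.0 + 10.0 * rng.standard_normal(500)                    # hPa
wind_downscaled = model.predict(pressure_gcm_future.reshape(-1, 1))
print(wind_downscaled.mean())
```

An implicit assumption of any such transfer function is that the relationship calibrated on past observations continues to hold under the changed climate of the GCM scenario.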

Wilby and Wigley placed meteorological downscaling techniques into four categories: [6] regression methods, weather pattern-based approaches and stochastic weather generators, which are all statistical downscaling methods, and limited-area modeling, which corresponds to dynamical downscaling. Among these approaches, regression methods are often preferred because of their relative ease of implementation and low computational requirements. Additionally, semi-mechanistic downscaling approaches can be applied, as used for example for the CHELSA downscaled model output: there the temperature algorithm is based on statistical downscaling, while the precipitation algorithm incorporates orographic predictors with a subsequent bias correction. [7]

Examples

In 2007 the U.S. Bureau of Reclamation collaborated with the U.S. Department of Energy's National Energy Technology Laboratory (DOE NETL), Santa Clara University (SCU), Lawrence Livermore National Laboratory (LLNL), and the University of California's Institute for Research on Climate Change and Its Societal Impacts (IRCCSI) to apply a technique called Bias Correction Spatial Disaggregation (BCSD) [8] to 112 contemporary global climate projections made available through the World Climate Research Programme's Coupled Model Intercomparison Project, Phase 3 (WCRP CMIP3). These projections represent 16 GCMs simulating climate responses to three greenhouse gas (GHG) scenarios from multiple initial climate system conditions.
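
The bias-correction step in approaches of this kind is commonly implemented as a form of quantile mapping, in which each simulated value is replaced by the observed value at the same cumulative probability. The sketch below illustrates only that generic idea on synthetic data; it is a simplified, assumption-laden example, not the specific BCSD implementation used in this project.

```python
import numpy as np

def quantile_map(simulated, obs_ref, sim_ref):
    """Map simulated values onto the observed distribution by matching quantiles.

    obs_ref and sim_ref are observed and simulated series for a common
    reference (training) period; `simulated` is the series to be corrected.
    """
    # Empirical CDF value of each simulated datum within the simulated reference.
    ranks = np.searchsorted(np.sort(sim_ref), simulated) / len(sim_ref)
    ranks = np.clip(ranks, 0.0, 1.0)
    # Read off the observed value at the same quantile.
    return np.quantile(obs_ref, ranks)

# Hypothetical example: a model that is about 2 degrees too warm and too variable.
rng = np.random.default_rng(1)
obs_ref = 15.0 + 3.0 * rng.standard_normal(600)      # observations, reference period
sim_ref = 17.0 + 4.0 * rng.standard_normal(600)      # model, same reference period
sim_future = 19.0 + 4.0 * rng.standard_normal(600)   # model, future period

corrected = quantile_map(sim_future, obs_ref, sim_ref)
print(round(sim_future.mean(), 1), round(corrected.mean(), 1))
```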

The effort resulted in 112 monthly temperature and precipitation projections over the continental U.S. at 1/8° (about 12 kilometres (7.5 mi)) spatial resolution, covering a 1950–2099 simulation period.

CORDEX

The Coordinated Regional Downscaling Experiment (CORDEX) was initiated in 2009 with the objective of providing a framework for the evaluation and comparison of downscaling model performance, as well as to define a set of experiments producing climate projections for use in impact and adaptation studies. [9] [10] CORDEX climate change experiments are driven by the WCRP CMIP5 [11] GCM outputs. CORDEX has defined 14 downscaling regions, or domains.

Related Research Articles

<span class="mw-page-title-main">Climate</span> Statistics of weather conditions in a given region over long periods

Climate is the long-term weather pattern in a region, typically averaged over 30 years. More rigorously, it is the mean and variability of meteorological variables over a time spanning from months to millions of years. Some of the meteorological variables that are commonly measured are temperature, humidity, atmospheric pressure, wind, and precipitation. In a broader sense, climate is the state of the components of the climate system, including the atmosphere, hydrosphere, cryosphere, lithosphere and biosphere and the interactions between them. The climate of a location is affected by its latitude, longitude, terrain, altitude, land use and nearby water bodies and their currents.

<span class="mw-page-title-main">Climate model</span> Quantitative methods used to simulate climate

Numerical climate models use quantitative methods to simulate the interactions of the important drivers of climate, including the atmosphere, oceans, land surface and ice. They are used for a variety of purposes, from studying the dynamics of the climate system to making projections of future climate. Climate models may also be qualitative models, or largely descriptive narratives of possible futures.

<span class="mw-page-title-main">Climatology</span> Scientific study of climate, defined as weather conditions averaged over a period of time

Climatology or climate science is the scientific study of Earth's climate, typically defined as weather conditions averaged over a period of at least 30 years. Climate concerns the atmospheric condition during an extended to indefinite period of time; weather is the condition of the atmosphere during a relatively brief period of time. The main topics of research are the study of climate variability, the mechanisms of climate change and modern climate change. This topic of study is regarded as part of the atmospheric sciences and a subdivision of physical geography, which is one of the Earth sciences. Climatology includes some aspects of oceanography and biogeochemistry.

<span class="mw-page-title-main">General circulation model</span> Type of climate model

A general circulation model (GCM) is a type of climate model. It employs a mathematical model of the general circulation of a planetary atmosphere or ocean. It uses the Navier–Stokes equations on a rotating sphere with thermodynamic terms for various energy sources. These equations are the basis for computer programs used to simulate the Earth's atmosphere or oceans. Atmospheric and oceanic GCMs, together with sea-ice and land-surface components, are the key components of global climate models.
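
In schematic form, the fluid-dynamical core of such a model integrates equations of roughly the following kind (a simplified sketch; operational GCMs use the hydrostatic primitive equations or fully compressible variants with many additional terms and parameterizations):

```latex
\frac{D\mathbf{v}}{Dt} = -\frac{1}{\rho}\nabla p - 2\,\boldsymbol{\Omega}\times\mathbf{v} + \mathbf{g} + \mathbf{F},
\qquad
\frac{D\rho}{Dt} + \rho\,\nabla\cdot\mathbf{v} = 0,
\qquad
c_p\frac{DT}{Dt} - \frac{1}{\rho}\frac{Dp}{Dt} = Q,
```

where v is the velocity, Ω the planetary rotation vector, F friction and parameterized mixing, and Q the diabatic heating supplied by the energy sources mentioned above.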

<span class="mw-page-title-main">Numerical weather prediction</span> Weather prediction using mathematical models of the atmosphere and oceans

Numerical weather prediction (NWP) uses mathematical models of the atmosphere and oceans to predict the weather based on current weather conditions. Though first attempted in the 1920s, it was not until the advent of computer simulation in the 1950s that numerical weather predictions produced realistic results. A number of global and regional forecast models are run in different countries worldwide, using current weather observations relayed from radiosondes, weather satellites and other observing systems as inputs.

In climatology, the Coupled Model Intercomparison Project (CMIP) is a collaborative framework designed to improve knowledge of climate change. It was organized in 1995 by the Working Group on Coupled Modelling (WGCM) of the World Climate Research Programme (WCRP). It is developed in phases, both to foster climate model improvements and to support national and international assessments of climate change. A related project is the Atmospheric Model Intercomparison Project (AMIP), the analogous framework for atmosphere-only general circulation models (GCMs).

<span class="mw-page-title-main">Ensemble forecasting</span> Multiple simulation method for weather forecasting

Ensemble forecasting is a method used in numerical weather prediction. Instead of making a single forecast of the most likely weather, a set of forecasts is produced. This set of forecasts aims to give an indication of the range of possible future states of the atmosphere. Ensemble forecasting is a form of Monte Carlo analysis. The multiple simulations are conducted to account for the two usual sources of uncertainty in forecast models: (1) the errors introduced by the use of imperfect initial conditions, amplified by the chaotic nature of the evolution equations of the atmosphere, often referred to as sensitive dependence on initial conditions; and (2) errors introduced by imperfections in the model formulation, such as the approximate mathematical methods used to solve the equations. Ideally, the verified future atmospheric state should fall within the predicted ensemble spread, and the amount of spread should be related to the uncertainty (error) of the forecast. In general, this approach can be used to make probabilistic forecasts of any dynamical system, not just for weather prediction.
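
A toy sketch of the initial-condition part of this idea, using the chaotic Lorenz-63 system as a stand-in for the atmosphere (a purely pedagogical assumption, not an operational ensemble system):

```python
import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the chaotic Lorenz-63 system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

rng = np.random.default_rng(2)
analysis = np.array([1.0, 1.0, 20.0])        # "best estimate" initial condition

# Build an ensemble by slightly perturbing the initial condition.
ensemble = analysis + 1e-3 * rng.standard_normal((20, 3))

# Integrate every member forward; the spread indicates forecast uncertainty.
for _ in range(2500):
    ensemble = np.array([lorenz63_step(member) for member in ensemble])

print("ensemble mean:", ensemble.mean(axis=0))
print("ensemble spread (std):", ensemble.std(axis=0))
```

Because the system is chaotic, members that start almost identically diverge over time, and the growing spread is what the ensemble uses to quantify forecast uncertainty.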

Data assimilation is a mathematical discipline that seeks to optimally combine theory with observations. There may be a number of different goals sought – for example, to determine the optimal state estimate of a system, to determine initial conditions for a numerical forecast model, to interpolate sparse observation data using knowledge of the system being observed, to set numerical parameters based on training a model from observed data. Depending on the goal, different solution methods may be used. Data assimilation is distinguished from other forms of machine learning, image analysis, and statistical methods in that it utilizes a dynamical model of the system being analyzed.
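
The core "optimal combination" step can be illustrated with a one-dimensional analysis update in which a model forecast and an observation are weighted by the inverses of their error variances. This is a minimal sketch of the idea behind optimal interpolation and the Kalman filter, not any particular operational scheme:

```python
def analysis_update(forecast, obs, var_forecast, var_obs):
    """Combine a forecast and an observation, weighting by error variances.

    Returns the analysis value and its (reduced) error variance.
    """
    gain = var_forecast / (var_forecast + var_obs)   # Kalman gain (scalar case)
    analysis = forecast + gain * (obs - forecast)    # analysis pulled toward the obs
    var_analysis = (1.0 - gain) * var_forecast       # analysis uncertainty shrinks
    return analysis, var_analysis

# Hypothetical numbers: the model says 21.0 C (variance 4), a thermometer says 19.0 C (variance 1).
print(analysis_update(21.0, 19.0, 4.0, 1.0))   # -> (19.4, 0.8)
```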

<span class="mw-page-title-main">Atmospheric model</span> Mathematical model of atmospheric motions

In atmospheric science, an atmospheric model is a mathematical model constructed around the full set of primitive, dynamical equations which govern atmospheric motions. It can supplement these equations with parameterizations for turbulent diffusion, radiation, moist processes, heat exchange, soil, vegetation, surface water, the kinematic effects of terrain, and convection. Most atmospheric models are numerical, i.e. they discretize the equations of motion. They can predict microscale phenomena such as tornadoes and boundary-layer eddies, sub-microscale turbulent flow over buildings, as well as synoptic and global flows. The horizontal domain of a model is either global, covering the entire Earth, or regional (limited-area), covering only part of the Earth. The different types of models run are thermotropic, barotropic, hydrostatic, and nonhydrostatic. Some of these model types make assumptions about the atmosphere that lengthen the time steps used and increase computational speed.

Geophysical Fluid Dynamics Laboratory Coupled Model is a coupled atmosphere–ocean general circulation model (AOGCM) developed at the NOAA Geophysical Fluid Dynamics Laboratory in the United States. It is one of the leading climate models used in the Fourth Assessment Report of the IPCC, along with models developed at the Max Planck Institute for Climate Research, the Hadley Centre and the National Center for Atmospheric Research.

Tom Michael Lampe Wigley is a climate scientist at the University of Adelaide. He is also affiliated with the University Corporation for Atmospheric Research. He was named a fellow of the American Association for the Advancement of Science (AAAS) for his major contributions to climate and carbon cycle modeling and to climate data analysis, and because he is "one of the world's foremost experts on climate change and one of the most highly cited scientists in the discipline." His Web of Science h-index is 75, and his Google Scholar h-index is 114. He has contributed to many of the reports published by the Intergovernmental Panel on Climate Change (IPCC), a body recognized by the joint award of the 2007 Nobel Peace Prize.

An atmospheric reanalysis is a meteorological and climate data assimilation project which aims to assimilate historical atmospheric observational data spanning an extended period, using a single consistent assimilation scheme throughout.

RheinBlick2050 is an environmental science research project on the impacts of regional climate change on discharge of the Rhine River and its major tributaries in Central Europe. The project runtime was from January 2008 until September 2010, initiated by and coordinated on behalf of the International Commission for the Hydrology of the Rhine Basin (CHR).

Pedometric mapping, or statistical soil mapping, is the data-driven generation of soil property and class maps based on statistical methods. Its main objectives are to predict values of some soil variable at unobserved locations, and to assess the uncertainty of that estimate using statistical inference, i.e. statistically optimal approaches. From the application point of view, its main objective is to accurately predict the response of a soil-plant ecosystem to various soil management strategies; that is, to generate maps of soil properties and soil classes that can be used for other environmental models and decision-making. It is largely based on applying geostatistics in soil science, along with other statistical methods used in pedometrics.

Ocean general circulation models (OGCMs) are a particular kind of general circulation model to describe physical and thermodynamical processes in oceans. The oceanic general circulation is defined as the horizontal space scale and time scale larger than mesoscale. They depict oceans using a three-dimensional grid that include active thermodynamics and hence are most directly applicable to climate studies. They are the most advanced tools currently available for simulating the response of the global ocean system to increasing greenhouse gas concentrations. A hierarchy of OGCMs have been developed that include varying degrees of spatial coverage, resolution, geographical realism, process detail, etc.

The field of complex networks has emerged as an important area of science that generates novel insights into the nature of complex systems. The application of network theory to climate science is a young and emerging field. To identify and analyze patterns in global climate, scientists model climate data as complex networks.
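
A minimal sketch of how such a climate network can be constructed: treat each grid point's time series as a node and connect pairs whose correlation exceeds a threshold. The synthetic data and the threshold value here are arbitrary assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)
n_nodes, n_times = 50, 400

# Synthetic stand-in for gridded anomaly time series (one row per grid point),
# sharing a weak common signal so that some pairs correlate strongly.
common_signal = rng.standard_normal(n_times)
series = 0.5 * common_signal + rng.standard_normal((n_nodes, n_times))

# Link node pairs whose Pearson correlation exceeds a chosen threshold.
corr = np.corrcoef(series)
adjacency = (np.abs(corr) > 0.3) & ~np.eye(n_nodes, dtype=bool)

degree = adjacency.sum(axis=1)   # node degree: number of strong links per grid point
print("mean degree:", degree.mean())
```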

Earth systems models of intermediate complexity (EMICs) form an important class of climate models, primarily used to investigate the Earth's systems on long timescales or at reduced computational cost. This is mostly achieved through operation at lower temporal and spatial resolution than more comprehensive general circulation models (GCMs). Due to the nonlinear relationship between spatial resolution and model run-speed, modest reductions in resolution can lead to large improvements in model run-speed. This has historically allowed the inclusion of previously unincorporated Earth systems such as ice sheets and carbon cycle feedbacks. These benefits are conventionally understood to come at the cost of some model accuracy. However, the degree to which higher-resolution models improve accuracy rather than simply precision is contested.
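
A rough worked example of this nonlinearity, under the common assumption that the time step is tied to the grid spacing by a CFL-type stability condition, is that the cost of an explicit model scales roughly as

```latex
\text{cost} \;\propto\; N_x \, N_y \, N_t \;\propto\; \frac{1}{\Delta x}\cdot\frac{1}{\Delta y}\cdot\frac{1}{\Delta t} \;\propto\; \Delta x^{-3}
\quad (\text{taking } \Delta t \propto \Delta x),
```

so doubling the horizontal grid spacing reduces the cost of the dynamical core by roughly a factor of eight, freeing resources for additional components such as ice sheets or a carbon cycle.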

The Fisheries and Marine Ecosystem Model Intercomparison Project (Fish-MIP) is a marine biology project to compare computer models of the impact of climate change on sea life. Founded in 2013 as part of the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP), it was established to answer questions about the future of marine biodiversity, seafood supply, fisheries, and marine ecosystem functioning in the context of various climate change scenarios. It combines diverse marine ecosystem models from both the global and regional scale through a standardized protocol for ensemble modelling in an attempt to correct for any bias in the individual models that make up the ensemble. Fish-MIP's goal is to use this ensemble modelling to project a more robust picture of the future state of fisheries and marine ecosystems under the impacts of climate change, and ultimately to help inform fishing policy.

Atmospheric correction for the Interferometric Synthetic Aperture Radar (InSAR) technique is a set of different methods to remove artefact displacement from an interferogram caused by the effect of weather variables such as humidity, temperature, and pressure. An interferogram is generated by processing two synthetic-aperture radar images before and after a geophysical event like an earthquake. Corrections for atmospheric variations are an important stage of InSAR data processing in many study areas to measure surface displacement, because relative humidity differences of 20% can cause inaccuracies of 10–14 cm in InSAR measurements due to varying delays in the radar signal. Overall, atmospheric correction methods can be divided into two categories: a) using Atmospheric Phase Screen (APS) statistical properties and b) using auxiliary (external) data such as GPS measurements, multi-spectral observations, local meteorological models, and global atmospheric models.

Volker Wulfmeyer is a German physicist, meteorologist, climate and earth system researcher, university professor, and member of the Heidelberg Academy of Sciences.

References

Notes
  1. Ribalaygua, J.; Torres, L.; Pórtoles, J.; Monjo, R.; Gaitan, E.; Pino, M.R. (2013). "Description and validation of a two-step analogue/regression downscaling method". Theoretical and Applied Climatology. 114 (1–2): 253–269. Bibcode:2013ThApC.114..253R. doi:10.1007/s00704-013-0836-x. S2CID   52253427.
  2. Peng, J.; Loew, A.; Merlin, O.; Verhoest, N.E.C. (2017). "A review of spatial downscaling of satellite remotely sensed soil moisture". Reviews of Geophysics. 55 (2): 341. Bibcode:2017RvGeo..55..341P. doi:10.1002/2016RG000543. hdl: 11858/00-001M-0000-002D-3843-0 . S2CID   73579104.
  3. Lee, T.; Jeong, C. (2014). "Nonparametric statistical temporal downscaling of daily precipitation to hourly precipitation and implications for climate change scenarios". Journal of Hydrology. 510: 182–196. Bibcode:2014JHyd..510..182L. doi:10.1016/j.jhydrol.2013.12.027.
  4. Monjo, R. (2016). "Measure of rainfall time structure using the dimensionless n-index". Climate Research. 67 (1): 71–86. Bibcode:2016ClRes..67...71M. doi: 10.3354/cr01359 . (pdf)
  5. Intergovernmental Panel on Climate Change (March 2014). "Evaluation of Climate Models". In: Climate Change 2013 – The Physical Science Basis (PDF). pp. 741–866. doi:10.1017/cbo9781107415324.020. ISBN 9781107415324. Retrieved 2019-08-06.
  6. Wilby, R.L.; Wigley, T.M.L. (1997). "Downscaling general circulation model output: a review of methods and limitations". Progress in Physical Geography. 21 (4): 530–548. doi:10.1177/030913339702100403. S2CID   18058016.
  7. Karger, D.N.; Conrad, O.; Böhner, J.; Kawohl, T.; Kreft, H.; Soria-Auza, R.W.; Zimmermann, N.E.; Linder, P.; Kessler, M. (2017). "Climatologies at high resolution for the Earth land surface areas". Scientific Data. 4 (170122): 170122. Bibcode:2017NatSD...470122K. doi:10.1038/sdata.2017.122. PMC   5584396 . PMID   28872642.
  8. Wood, A. W.; Leung, L. R.; Sridhar, V.; Lettenmaier, D. P. (2004-01-01). "Hydrologic Implications of Dynamical and Statistical Approaches to Downscaling Climate Model Outputs". Climatic Change. 62 (1–3): 189–216. doi:10.1023/B:CLIM.0000013685.99609.9e. ISSN   0165-0009. S2CID   27377984.
  9. "CAB Direct". www.cabdirect.org. Retrieved 2019-08-06.
  10. Gutowski Jr., William J.; Giorgi, Filippo; Timbal, Bertrand; Frigon, Anne; Jacob, Daniela; Kang, Hyun-Suk; Raghavan, Krishnan; Lee, Boram; Lennard, Christopher (2016-11-17). "WCRP COordinated Regional Downscaling EXperiment (CORDEX): a diagnostic MIP for CMIP6". Geoscientific Model Development. 9 (11): 4087–4095. Bibcode:2016GMD.....9.4087G. doi: 10.5194/gmd-9-4087-2016 . hdl: 11336/29500 . ISSN   1991-9603.
  11. Taylor, Karl E.; Stouffer, Ronald J.; Meehl, Gerald A. (2011-10-07). "An Overview of CMIP5 and the Experiment Design". Bulletin of the American Meteorological Society. 93 (4): 485–498. doi: 10.1175/BAMS-D-11-00094.1 . ISSN   0003-0007.