
Multilevel Monte Carlo estimation of background error covariances in ensemble variational data assimilation

<p>In ensemble variational (EnVar) data assimilation systems, background error covariances are sampled from an ensemble of forecasts evolving with time. One possible way of generating this ensemble is by running an Ensemble of Data Assimilations (EDA) that samples all possible error sources (initial condition errors, boundary condition errors, model errors). Large ensemble sizes are desirable to minimize sampling errors, but generating a single ensemble member is usually expensive due to the cost of integrating the physical model. In practice, ensembles with coarser spatial resolutions are sometimes used, allowing for cheaper generation of individual members, and thus larger ensemble sizes.</p><p>Multilevel Monte Carlo (MLMC) methods propose to go beyond this usual trade-off between grid resolution and ensemble size, by expressing a fine-grid estimator as an astute combination of estimators computed on a hierarchy of spatial grids. Starting from a Monte Carlo covariance estimator on a coarse grid but with a large ensemble size, correction terms are added to form a quasi-telescopic sum. The correction terms come from EDAs of increasing spatial resolutions and decreasing ensemble sizes, with a pairwise stochastic coupling between EDAs of two successive resolutions. The expectation of this MLMC estimator is equal to the expectation of the Monte Carlo estimator on the finest grid, so that no bias is introduced by the coarse resolution forecasts. Without increasing the computational cost, MLMC effectively reduces the variance of the covariance estimator, <em>i.e.</em> reduces the sampling noise on covariances.</p><p>We first present the theoretical basis of MLMC and how it can apply to the estimation of covariance matrices. An illustration with a quasi-geostrophic model is then presented. 
For a given computational budget, we compare three equal-cost methods to estimate background error covariances: (1) the usual single-resolution ensemble estimate, (2) a combination of estimates of various resolutions based on Bayesian Model Averaging, and (3) the MLMC estimate. The methods are compared in terms of mean square error of the covariance estimators, and in terms of quality of the resulting analyses for one assimilation cycle. The role of covariance localization in each case is also briefly discussed.</p><p><em>This work is partially supported by the 3IA Artificial and Natural Intelligence Toulouse Institute, French "Investing for the Future – PIA3" program under the Grant agreement ANR-19-PI3A-0004.</em></p><p><em>This project has received financial support from the CNRS through the 80Prime program.</em></p>
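<p>The quasi-telescopic sum described in the abstract can be sketched numerically. The following is a minimal NumPy illustration, not the authors' system: the toy smoothed-noise "forecast", the grid sizes, the ensemble sizes, and the linear prolongation operator are all illustrative assumptions. The key mechanism it demonstrates is the pairwise stochastic coupling: each correction term reuses the same random input at two successive resolutions, so the two covariance estimates are correlated and their difference has low variance.</p>

```python
import numpy as np

rng = np.random.default_rng(0)

def forecast(xi, n):
    """Toy 'forecast': smooth a white-noise field xi and sample it on an
    n-point grid. Stands in for one ensemble member at resolution n; two
    resolutions driven by the same xi are stochastically coupled."""
    kernel = np.ones(9) / 9
    smooth = np.convolve(xi, kernel, mode="same")
    x_fine = np.linspace(0.0, 1.0, xi.size)
    return np.interp(np.linspace(0.0, 1.0, n), x_fine, smooth)

def prolong(C, n_fine):
    """Linear prolongation of a covariance matrix to the fine grid,
    i.e. P C P^T with P a 1-D interpolation operator."""
    xc = np.linspace(0.0, 1.0, C.shape[0])
    xf = np.linspace(0.0, 1.0, n_fine)
    C_rows = np.array([np.interp(xf, xc, row) for row in C])      # C P^T
    return np.array([np.interp(xf, xc, col) for col in C_rows.T]).T

def sample_cov(members):
    return np.cov(np.array(members), rowvar=False, ddof=1)

# Hierarchy of grids (coarse -> fine) with decreasing ensemble sizes.
grids = [16, 32, 64]
sizes = [256, 64, 16]
n_fine = grids[-1]
n_noise = 256  # resolution of the shared stochastic input

# Level 0: plain Monte Carlo estimate on the coarsest grid, large ensemble.
members0 = [forecast(rng.standard_normal(n_noise), grids[0])
            for _ in range(sizes[0])]
B = prolong(sample_cov(members0), n_fine)

# Correction terms: coupled pairs of two successive resolutions.
for l in range(1, len(grids)):
    fine_m, coarse_m = [], []
    for _ in range(sizes[l]):
        xi = rng.standard_normal(n_noise)      # same noise at both levels
        fine_m.append(forecast(xi, grids[l]))
        coarse_m.append(forecast(xi, grids[l - 1]))
    B += prolong(sample_cov(fine_m), n_fine) \
       - prolong(sample_cov(coarse_m), n_fine)

print(B.shape)  # (64, 64)
```

<p>By construction the expectation of this estimator equals that of a single-level Monte Carlo estimator on the finest grid, since the prolonged coarse terms cancel in expectation level by level; only the variance is reduced.</p>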

Related Results

Multilevel Monte Carlo methods for ensemble variational data assimilation
Abstract. Ensemble variational data assimilation relies on ensembles of forecasts to estimate the background error covariance matrix B. The ensemble can be provided by an ensemble ...
Monte Carlo methods: barrier option pricing with stable Greeks and multilevel Monte Carlo learning
For discretely observed barrier options, there exists no closed solution under the Black-Scholes model. Thus, it is often helpful to use Monte Carlo simulations, which are easily a...
Efficient ensemble data assimilation for coupled models with the Parallel Data Assimilation Framework: example of AWI-CM (AWI-CM-PDAF 1.0)
Abstract. Data assimilation integrates information from observational measurements with numerical models. When used with coupled models of Earth system compartments, e.g., the atmo...
Research on Multi-Group Monte Carlo Calculations Based on Group Constants Generated by RMC
Abstract. Nowadays, deterministic two-step or Monte Carlo methods are commonly used in core physics calculations. However, with the development of reactor core design, tradi...
Deep Learning for Latent Space Data Assimilation LSDA in Subsurface Flow Systems
Abstract. We present a deep learning architecture for efficient reduced-order implementation of ensemble data assimilation. Specifically, deep learning is used to imp...
The characteristics of Hessian singular vectors using an advanced data assimilation scheme
Abstract. Initial condition uncertainty is a significant source of forecast error in numerical weather prediction. Singular vectors of the tangent linear propagator can identify dire...
On Flores Island, do "ape-men" still exist? https://www.sapiens.org/biology/flores-island-ape-men/
