Combining Physics-based Modeling, Machine Learning, and Data Assimilation for Forecasting Large, Complex, Spatiotemporally Chaotic Systems

Date

2023

Abstract

We consider the challenging problem of forecasting high-dimensional, spatiotemporally chaotic systems. We are primarily interested in forecasting the dynamics of Earth's atmosphere and oceans, where one seeks forecasts that (a) accurately reproduce the true system trajectory in the short term, as desired in weather forecasting, and (b) correctly capture the long-term ergodic properties of the true system, as desired in climate modeling. We aim to leverage two types of information in making our forecasts: incomplete scientific knowledge in the form of an imperfect forecast model, and past observations of the true system state that may be sparse and/or noisy. In this thesis, we ask whether machine learning (ML) and data assimilation (DA) can be used to combine observational information with a physical knowledge-based forecast model to produce accurate short-term forecasts and consistent long-term climate dynamics.

We first describe and demonstrate a technique called Combined Hybrid-Parallel Prediction (CHyPP), which combines a global knowledge-based model with a parallel ML architecture consisting of many reservoir computers, trained using complete observations of the system's past evolution. Using the Kuramoto-Sivashinsky equation as our test model, we demonstrate that this technique produces more accurate short-term forecasts than either the knowledge-based or the ML component model acting alone, and that it is scalable to large spatial domains. We further demonstrate, using the multi-scale Lorenz Model 3, that CHyPP can incorporate the effect of unresolved short-scale dynamics (subgrid-scale closure).
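As a rough illustration of the hybrid idea, the sketch below trains a single reservoir computer whose linear readout acts on the concatenation of the reservoir state and an imperfect knowledge-based one-step forecast; the parallel, many-reservoir decomposition of CHyPP is omitted. All names (`kb_step`, `reservoir_step`), sizes, and hyperparameters are illustrative assumptions, not the configuration used in the thesis.

```python
import numpy as np

# Minimal single-reservoir sketch of hybrid prediction (illustrative sizes/parameters).
rng = np.random.default_rng(0)
D, N = 64, 500                                 # system dimension, reservoir size
W_in = rng.uniform(-0.1, 0.1, (N, D))          # input coupling
A = rng.uniform(-1.0, 1.0, (N, N)) * (rng.random((N, N)) < 0.02)  # sparse recurrence
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))                   # spectral radius ~0.9

def kb_step(u):
    """Imperfect knowledge-based one-step forecast (placeholder: persistence)."""
    return u

def reservoir_step(r, u):
    """Advance the reservoir state given the current input."""
    return np.tanh(A @ r + W_in @ u)

def train_readout(U, beta=1e-6):
    """Drive the reservoir along the training trajectory U (T x D) and fit the linear
    readout on hybrid features [reservoir state, knowledge-based forecast] by ridge regression."""
    r = np.zeros(N)
    feats, targets = [], []
    for t in range(U.shape[0] - 1):
        r = reservoir_step(r, U[t])
        feats.append(np.concatenate([r, kb_step(U[t])]))
        targets.append(U[t + 1])
    X, Y = np.array(feats), np.array(targets)
    W_out = np.linalg.solve(X.T @ X + beta * np.eye(X.shape[1]), X.T @ Y).T
    return W_out, r

def hybrid_forecast(W_out, r, u0, steps):
    """Closed-loop forecast: each hybrid prediction is fed back as the next input
    to both the reservoir and the knowledge-based model."""
    u, out = u0, []
    for _ in range(steps):
        r = reservoir_step(r, u)
        u = W_out @ np.concatenate([r, kb_step(u)])
        out.append(u)
    return np.array(out)
```

Because the knowledge-based forecast enters only through the feature vector, the same ridge-regression training applies whether or not a physical model is available; dropping `kb_step` from the features recovers a purely ML reservoir predictor.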

We next demonstrate how DA, in the form of the Ensemble Transform Kalman Filter (ETKF), can be used to extend the hybrid ML approach to the case where our system observations are sparse and noisy. Using a novel iterative scheme, we show that DA can be used to obtain training data for successive generations of hybrid ML models, improving both forecast accuracy and the estimate of the full system state over those obtained using the imperfect knowledge-based model alone.
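For concreteness, a minimal ETKF analysis step in a standard ensemble-space formulation is sketched below, assuming a linear observation operator and omitting localization and covariance inflation; array shapes and names are illustrative assumptions, not the thesis's implementation.

```python
import numpy as np

def etkf_analysis(Xf, y, H, R):
    """One ETKF analysis step (no localization or inflation).
    Xf: (D, m) forecast ensemble; y: (p,) observations;
    H:  (p, D) linear observation operator; R: (p, p) observation error covariance."""
    D, m = Xf.shape
    xf_mean = Xf.mean(axis=1)
    Xp = Xf - xf_mean[:, None]                  # state perturbations
    Yf = H @ Xf
    yf_mean = Yf.mean(axis=1)
    Yp = Yf - yf_mean[:, None]                  # observation-space perturbations
    C = Yp.T @ np.linalg.inv(R)                 # (m, p)
    Pa_tilde = np.linalg.inv((m - 1) * np.eye(m) + C @ Yp)   # ensemble-space covariance
    w_mean = Pa_tilde @ C @ (y - yf_mean)       # mean-update weights
    # symmetric square root gives the analysis perturbation weights
    evals, evecs = np.linalg.eigh((m - 1) * Pa_tilde)
    Wa = evecs @ np.diag(np.sqrt(evals)) @ evecs.T
    W = Wa + w_mean[:, None]
    return xf_mean[:, None] + Xp @ W            # (D, m) analysis ensemble
```

In an iterative DA-ML scheme of the kind described above, the analysis ensemble mean from repeated application of such a step would supply the (noise-reduced, gap-filled) state estimates used as training data for the next generation of hybrid models.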

Finally, we explore the commonly used technique of adding observational noise to the ML model input during training to improve long-term stability and climate replication. We develop a novel training technique, Linearized Multi-Noise Training (LMNT), that approximates the effect of this noise addition. We demonstrate that reservoir computers trained with noise or with LMNT regularization are stable and replicate the true system climate, and that, for reservoir computers, LMNT allows the regularization parameter to be tuned more easily.
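The noise-addition baseline that LMNT approximates can be sketched as follows: the readout is fit on several noisy copies of the training inputs while the targets remain noise-free (one common variant). The sketch reuses `reservoir_step` and `kb_step` from the earlier hybrid example, with illustrative parameter values; the LMNT linearization itself is not reproduced here.

```python
import numpy as np

def train_readout_with_noise(U, reservoir_step, kb_step, N, n_noise=5,
                             noise_std=1e-2, beta=1e-8, seed=0):
    """Fit the hybrid readout on several noisy copies of the training inputs.
    Each copy perturbs the inputs with i.i.d. Gaussian noise, which acts as a
    regularizer promoting stable closed-loop (autonomous) forecasts."""
    rng = np.random.default_rng(seed)
    feats, targets = [], []
    for _ in range(n_noise):
        Un = U + noise_std * rng.standard_normal(U.shape)   # noisy training inputs
        r = np.zeros(N)
        for t in range(U.shape[0] - 1):
            r = reservoir_step(r, Un[t])
            feats.append(np.concatenate([r, kb_step(Un[t])]))
            targets.append(U[t + 1])                         # targets stay noise-free
    X, Y = np.array(feats), np.array(targets)
    return np.linalg.solve(X.T @ X + beta * np.eye(X.shape[1]), X.T @ Y).T
```

Retuning `noise_std` requires re-driving the reservoir over all noisy copies, which is the cost that a linearized approximation of the noise effect is intended to avoid.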
