Proactive Quality Control based on Ensemble Forecast Sensitivity to Observations

dc.contributor.advisor: Kalnay, Eugenia
dc.contributor.author: Hotta, Daisuke
dc.contributor.department: Applied Mathematics and Scientific Computation
dc.contributor.publisher: Digital Repository at the University of Maryland
dc.contributor.publisher: University of Maryland (College Park, Md.)
dc.date.accessioned: 2014-10-11T05:53:22Z
dc.date.available: 2014-10-11T05:53:22Z
dc.date.issued: 2014
dc.description.abstract: Despite recent major improvements in numerical weather prediction (NWP) systems, operational NWP forecasts occasionally suffer from an abrupt drop in forecast skill, a phenomenon called "forecast skill dropout." Recent studies have shown that these dropouts occur not because of deficiencies in the model but because of flawed observations that the operational quality control (QC) system failed to filter out. Thus, to minimize the occurrence of forecast skill dropouts, such flawed observations must be detected and removed. A diagnostic technique called Ensemble Forecast Sensitivity to Observations (EFSO) enables us to quantify how much each observation has improved or degraded the forecast. A recent study (Ota et al., 2013) showed that flawed observations responsible for regional forecast skill dropouts can be detected by applying EFSO with a 24-hour lead time, and that the forecast can be improved by not assimilating the detected observations. Inspired by their success, in the first part of this study we propose a new QC method, which we call Proactive QC (PQC), in which flawed observations are detected by EFSO 6 hours after the analysis, and the analysis and forecast are then repeated without the detected observations. This new QC technique is implemented and tested on a lower-resolution version of NCEP's operational global NWP system. The results are extremely promising: regional forecast skill dropouts and the flawed observations that cause them can be detected only 6 hours after the analysis, and rejecting the identified flawed observations indeed improves 24-hour forecasts. In the second part, we show that the same approximation used in the derivation of EFSO can be used to formulate the forecast sensitivity to the observation error covariance matrix R, which we call EFSR. We implement the EFSR diagnostics in both an idealized system and the quasi-operational NWP system and show that it can be used to tune the R matrix so that the utility of observations is improved. We also point out that EFSO and EFSR can be used for the optimal assimilation of new observing systems.
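The abstract's central quantity, the EFSO per-observation forecast impact, can be illustrated with a toy ensemble-space computation: the innovation vector is combined elementwise with a sensitivity vector built from analysis perturbations mapped into observation space, forecast perturbations, and the mean forecast error. This is a minimal schematic sketch with synthetic data, not the thesis's implementation; all array names, dimensions, and the identity choices for R and the error norm C are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: n state variables, p observations, K ensemble members.
n, p, K = 10, 5, 20

# Synthetic, mean-removed ensemble analysis perturbations and the
# corresponding forecast perturbations valid at evaluation time.
Xa = rng.standard_normal((n, K))
Xa -= Xa.mean(axis=1, keepdims=True)
Xf = Xa + 0.1 * rng.standard_normal((n, K))
Xf -= Xf.mean(axis=1, keepdims=True)

H = rng.standard_normal((p, n))   # linear observation operator (assumed)
R = np.eye(p)                     # observation-error covariance (toy: identity)
C = np.eye(n)                     # error norm, e.g. moist total energy (toy: identity)

d = rng.standard_normal(p)        # innovations y - H(x_b)
e_sum = rng.standard_normal(n)    # sum of mean forecast errors from the two analyses

# EFSO-style sensitivity: R^{-1} H Xa Xf^T C e / (K - 1), then partition the
# total forecast-error change over observations elementwise by innovation.
Ya = H @ Xa                                        # perturbations in obs space
sens = np.linalg.solve(R, Ya @ Xf.T @ (C @ e_sum)) / (K - 1)
impact = d * sens                                  # per-observation impact

# Proactive QC idea: observations with positive (degrading) impact are
# candidates for rejection before the analysis and forecast are redone.
flagged = np.where(impact > 0)[0]
```

Elementwise partitioning means the per-observation impacts sum exactly to the total estimated forecast-error change, which is what lets PQC attribute a regional skill dropout to specific observations.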
dc.identifier: https://doi.org/10.13016/M23W3P
dc.identifier.uri: http://hdl.handle.net/1903/15794
dc.language.iso: en
dc.subject.pqcontrolled: Applied mathematics
dc.subject.pqcontrolled: Atmospheric sciences
dc.subject.pquncontrolled: Data Assimilation
dc.subject.pquncontrolled: Ensemble Kalman Filter
dc.subject.pquncontrolled: Forecast Sensitivity
dc.subject.pquncontrolled: Lorenz N-cycle
dc.subject.pquncontrolled: Numerical Weather Prediction
dc.subject.pquncontrolled: Quality Control
dc.title: Proactive Quality Control based on Ensemble Forecast Sensitivity to Observations
dc.type: Dissertation

Files

Original bundle

Name: Hotta_umd_0117E_15536.pdf
Size: 16.23 MB
Format: Adobe Portable Document Format