Simulated sensitivity time series and model performance in three German catchments

This dataset contains the major results of the article “Improving information extraction from model data using sensitivity-weighted performance criteria” by Guse et al. (2020). The article analyses how a sensitivity-weighted performance criterion improves parameter identifiability and model performance. More details are given in the article.

The files of this dataset are described as follows.

Parameter sampling: FAST parameter sampling.xlsx:
To estimate the sensitivity, the Fourier Amplitude Sensitivity Test (FAST) was used (R-routine FAST, Reusser, 2013). Each column shows the values of one model parameter of the SWAT model (Arnold et al., 1998). All parameters are explained in detail in Neitsch et al. (2011). The FAST parameter sampling defines the required number of model runs; for the twelve model parameters used here, 579 model runs are required. The same parameter sets were used for all catchments.
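As an illustration, the parameter sets can be applied to the model in a simple loop. The following is a minimal sketch in R; read_excel() comes from the readxl package, and run_swat() is a hypothetical placeholder for the actual SWAT model call and is not part of this dataset.

  library(readxl)

  par_sets <- read_excel("FAST parameter sampling.xlsx")  # 579 FAST-sampled sets of the 12 SWAT parameters
  sim_q <- vector("list", nrow(par_sets))

  for (i in seq_len(nrow(par_sets))) {
    # run_swat() is a hypothetical wrapper that runs SWAT with one parameter set
    sim_q[[i]] <- run_swat(unlist(par_sets[i, ]))
  }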

Daily sensitivity time series: Sensitivity_2000_2005.xlsx:
Daily time series of parameter sensitivity for the period 2000-2005 for three catchments in Germany (Treene, Saale, Kinzig). Each column shows the sensitivity of one parameter of the SWAT model. The methodological approach of the temporal dynamics of parameter sensitivity (TEDPAS) was developed by Reusser et al. (2011) and first applied to the SWAT model in Guse et al. (2014). The sensitivity index is the first-order partial variance, i.e. the partial variance of one parameter divided by the total variance. The sensitivity therefore always lies between 0 and 1, and the sum of one row, i.e. the sensitivities of all model parameters on one day, cannot exceed 1.
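This row-sum property can be checked directly from the file; a minimal sketch in R, assuming the sensitivities are stored in numeric columns next to a date column (the small tolerance only covers rounding in the stored values):

  library(readxl)

  sens <- read_excel("Sensitivity_2000_2005.xlsx")
  daily_sum <- rowSums(sens[, sapply(sens, is.numeric)])  # sum of first-order sensitivities per day
  all(daily_sum <= 1 + 1e-9)                              # TRUE; the remaining variance stems from parameter interactions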

Parameter sampling: LH parameter sampling.xlsx:
To calculate parameter identifiability, Latin Hypercube sampling was used to generate 2000 parameter sets (R-package FME, Soetaert and Petzoldt, 2010). Each column shows the values of one model parameter of the SWAT model (Arnold et al., 1998). All parameters are explained in detail in Neitsch et al. (2011). The same parameter sets were used for all catchments.
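For illustration, a Latin Hypercube sample of this kind can be generated in base R as sketched below (the article used the FME package; the two parameter ranges shown here are placeholders, the ranges actually used are documented in the article):

  # one stratified, shuffled draw per parameter, rescaled to its range
  lhs_sample <- function(n, ranges) {
    sapply(seq_len(nrow(ranges)), function(p) {
      u <- (sample(n) - runif(n)) / n
      ranges[p, 1] + u * (ranges[p, 2] - ranges[p, 1])
    })
  }

  ranges <- rbind(CN2 = c(35, 98), ALPHA_BF = c(0, 1))  # placeholder ranges for two SWAT parameters
  par_lh <- lhs_sample(2000, ranges)                    # 2000 parameter sets, one column per parameter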

Performance criteria with and without sensitivity weights: RSR_RSRw_cal.xlsx:
• The RSR is calculated once, and the RSR_w is calculated separately for each model parameter.
• RSR: typical RSR (RMSE divided by the standard deviation of the observations).
• RSR_w: RSR with weights according to the daily sensitivity time series.

The calculation was carried out in all three catchments.
• The column RSR shows the RSR (RMSE divided by the standard deviation of the observations) for the different model runs.
• The columns RSR[_parameter name] show the RSR_w for the respective model parameter.
• RSR_w weights each day according to the daily parameter sensitivity (as given in Sensitivity_2000_2005.xlsx), so that days with a higher parameter sensitivity receive a higher weight (see the sketch below).
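A minimal base-R sketch of such a weighting, illustrative only; the exact definition of RSR_w is given in the article. Here obs, sim and w are placeholder vectors standing for the observed discharge, the simulated discharge and the daily sensitivity of one parameter:

  obs <- c(2.1, 3.4, 5.0, 4.2)  # placeholder observed discharge
  sim <- c(2.0, 3.8, 4.6, 4.4)  # placeholder simulated discharge
  w   <- c(0.1, 0.6, 0.8, 0.2)  # placeholder daily sensitivity of one parameter

  rsr   <- sqrt(mean((sim - obs)^2)) / sd(obs)                      # standard RSR
  rsr_w <- sqrt(sum(w * (sim - obs)^2) / sum(w)) /                  # one plausible weighted form:
           sqrt(sum(w * (obs - weighted.mean(obs, w))^2) / sum(w))  # days with high sensitivity count more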

In the methodological approach, the best 25 % of the model runs (i.e. the best 500 runs) were selected and the model parameters were constrained to the most appropriate parameter values (see the methodological description in the article).
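As an illustration, this selection step could look as follows in base R, assuming rsr_w_runs holds the RSR_w values of the 2000 model runs for one parameter and par_lh the corresponding Latin Hypercube parameter sets (lower RSR_w values indicate better runs):

  best        <- order(rsr_w_runs)[1:500]         # best 25 % of the 2000 model runs
  constrained <- apply(par_lh[best, ], 2, range)  # constrained (min, max) per parameter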

Performance criteria for the three catchments: GOFrun_[catchment name]_RSR.xlsx:
These three tables are organised identically and are available for the three catchments in Germany (Treene, Saale, Kinzig). Using the catchment-specific parameter ranges defined in the previous steps, 2000 model simulations were carried out, again based on Latin Hypercube sampling (R-package FME, Soetaert and Petzoldt, 2010). The three tables show the results of the 2000 model simulations for ten different performance criteria, for the two methodological approaches (RSR and swRSR) and two periods (calibration: 2000-2005; validation: 2006-2010).

Performance criteria for the three catchments: GOFrun_[catchment name]_MAE.xlsx:
These three tables are organised in the same way and show the results of the 2000 model simulations for ten different performance criteria, for the two methodological approaches (MAE and swMAE) and two periods (calibration: 2000-2005; validation: 2006-2010).
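Analogous to the RSR sketch above, the MAE and a sensitivity-weighted variant could be written as follows (illustrative only; the exact definition of swMAE is given in the article):

  mae   <- mean(abs(sim - obs))              # mean absolute error
  mae_w <- sum(w * abs(sim - obs)) / sum(w)  # days with higher parameter sensitivity weighted more strongly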

Identifier
DOI https://doi.org/10.5880/GFZ.4.4.2019.004
Related Identifier https://doi.org/10.1111/j.1752-1688.1998.tb05961.x
Related Identifier https://doi.org/10.1002/hyp.9777
Related Identifier https://swat.tamu.edu/media/99192/swat2009-theory.pdf
Related Identifier https://doi.org/10.1029/2010WR009947
Related Identifier http://CRAN.R-project.org/package=fast
Related Identifier https://doi.org/10.18637/jss.v033.i03
Related Identifier https://doi.org/10.1029/2019WR025605
Metadata Access http://doidb.wdc-terra.org/oaip/oai?verb=GetRecord&metadataPrefix=oai_datacite&identifier=oai:doidb.wdc-terra.org:6896
Provenance
Creator Guse, Björn; Pfannerstill, Matthias; Kiesel, Jens; Strauch, Michael; Volk, Martin; Gupta, Hoshin; Fohrer, Nicola
Publisher GFZ Data Services
Contributor Guse, Björn
Publication Year 2020
Funding Reference Deutsche Forschungsgemeinschaft, GU 1466 1 1
Rights CC BY 4.0; http://creativecommons.org/licenses/by/4.0/
OpenAccess true
Contact Guse, Björn
Representation
Resource Type Dataset
Discipline Geosciences
Spatial Coverage (9.241W, 54.503S, 9.681E, 54.749N); Treene catchment; Saale catchment; Kinzig catchment