The PRUDENCE data server is a collection of climate model output from the PRUDENCE project. The data are now public with a few exceptions. You should familiarize yourself with the ideas behind the project and with the model simulation strategy in the description of work of the project.
Please inform us if you use data from this archive for your research, and please include an acknowledgment in any publication. If you have problems or find errors in the data sets, please tell us.
The acknowledgment should be something like: "Data have been provided through the PRUDENCE data archive, funded by the EU through contract EVK2-CT2001-00132."
The data can be accessed either through direct download of netCDF files or through DODS where it is possible to extract subareas and subperiods to minimize download time. You may want to experiment with looking at small files like the monthly data or seasonal means first.
Short descriptions of the models involved in PRUDENCE can be found here.
Please consult the overview table, which lists the experiments and their acronyms, or this more compact table, to find out how to identify the experiments.
Any updates to the data sets will be documented here. Please note the file "md5sums" in each directory, which contains a checksum for each file, so that you can verify the integrity of your downloaded data set with the md5sum program.
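The verification step can also be sketched in Python with the standard hashlib module. In this self-contained example the file name and contents are dummies standing in for a real download; the "md5sums" line format shown, `<digest>  <filename>`, matches the output of the md5sum program:

```python
import hashlib
import os

def md5_of(path, chunk=1 << 20):
    """Compute the MD5 hex digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def verify(md5sums_text, directory="."):
    """Check files against lines in md5sums format: '<digest>  <filename>'."""
    results = {}
    for line in md5sums_text.strip().splitlines():
        digest, name = line.split(maxsplit=1)
        results[name] = (md5_of(os.path.join(directory, name)) == digest)
    return results

# Self-contained demo: a dummy file stands in for a downloaded data file.
with open("t2m.mean.DMI.HC1.nc", "wb") as f:
    f.write(b"example data\n")
sums = hashlib.md5(b"example data\n").hexdigest() + "  t2m.mean.DMI.HC1.nc"
print(verify(sums))  # {'t2m.mean.DMI.HC1.nc': True}
```

In practice you would read the archive's "md5sums" file from the download directory instead of constructing the checksum line by hand.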
The definitions of the regional-model grids can be found in the area description page.
All experiments deliver daily averages of most of the following 18 fields:
| t2m | 2-meter temperature (K) |
| clcov | Total cloudiness (fraction) |
| snow | Snow water equivalent (mm) |
| runoff | Total runoff (mm/d) |
| soilw | Soil moisture (mm) |
| Psurf | Surface pressure (hPa) |
| MSLP | Mean sea level pressure (hPa) |
| t2max | Daily maximum 2-meter temperature (K) |
| t2min | Daily minimum 2-meter temperature (K) |
| w10m | 10-meter wind speed (average length of the wind vector) (m/s) |
| w10max | 10-meter daily maximum wind speed (m/s) |
| q2m | 2-meter specific humidity (kg/kg) |
| SWnet | Net SW radiation (W/m^2), positive |
| SWdown | Downward SW radiation (W/m^2), positive |
| LWnet | Net LW radiation (W/m^2), positive |
| LWdown | Downward LW radiation (W/m^2), positive downward |
Some institutes provide these alternative moisture fields instead:
| rh2m | 2-meter relative humidity (fraction) |
| td2m | 2-meter dew point temperature (K) |
As experience shows, theory and reality differ: some of these fields are absent from some experiments, but most fields are available for all experiments. Some data sets comprise 30 years (1961-1990 and 2071-2100), others 31 years (1960-1990 and 2070-2100). Note that all years have 360 days.
All fields are also present as monthly means on their native grids, i.e., 360 or 372 values for the 30 or 31 years of the daily data set. Furthermore, these monthly data sets are also available interpolated to a common grid (the CRU grid). Finally, seasonal averages and standard deviations have been computed from these interpolated CRU-grid data. From CNRM we have obtained monthly and seasonal fields from several experiments outside of PRUDENCE; daily data, however, are available only for the PRUDENCE experiments.
The interpolations have been done as follows:
The main source of information about netCDF is the netCDF homepage http://wiht.link/NetCDF-resources (Note 2017/01/01: this link has been updated; use it to get netCDF information, as the other links in this paragraph are outdated). There you can find the source code for the netCDF library, information about and links to software that can use netCDF, conventions for writing netCDF files, and a discussion group (for the CF conventions) where you can pose questions, make suggestions, and so forth.
For a short overview you may check out the following web page http://climate.uvic.ca/people/jgregory/netcdf.html
Within PRUDENCE we have agreed on following the CF (Climate and Forecast) conventions.
All PRUDENCE data files are accessible in two ways: direct download, and processing with DODS (Note 2017/01/01: DODS has been superseded by the new OpenDAP link here). As a gateway to direct download, please follow this link. To access the data through DODS, please go here. In order to access the HC HadAM3H and HadRM3H data, please go here; DODS access to these data is unfortunately not possible.
Seasonal and monthly data are rather small files, which you are very welcome to download directly if you want to. But the daily data are large, several hundred MB per field per experiment. Therefore, unless you need the entire area and the entire time period, please take advantage of DODS. This way you can limit the downloaded data to the points, periods, and frequencies that are relevant to you!
Using DODS you can pick subareas by choosing starting and ending points and a stride for each coordinate dimension, i.e., pseudo-longitude, pseudo-latitude, and time. Data can be downloaded as ASCII, which is a bit awkward, but they can also be downloaded as DODS data. Perhaps most usefully, you will be provided with a URL that can be used in place of a filename in DODS-aware applications. For example, with the DODS-aware version of ncview, dncview, installed, you can plot the winter temperature of the HIRHAM control simulation simply by passing such a URL to dncview.
Please consult the DODS home page about obtaining DODS-aware client applications; they will make your life a lot easier when working with the PRUDENCE data sets. Note that you can create netCDF files of the sub-areas of interest with the following two-step process using dncdump and ncgen:
dncdump 'http://ensemblesrt3.dmi.dk:8080/opendap/data/prudence/t2m.mean.DMI.HC1.nc?longitude[0:1:99],latitude[0:1:79],height[0:1:0],time[0:1:0],time_bnds[0:1:0][0:1:1],t2m[0:1:0][0:1:0][0:1:79][0:1:99]' > local.dat
ncgen -b -o local.nc local.dat
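The bracketed index ranges in such a URL follow the pattern [start:stride:stop] for each dimension, with inclusive 0-based indices. A small Python sketch of how the constraint expression for the t2m variable above could be assembled (the hyperslab helper is hypothetical, not part of any DODS library; the base URL is the one from the example):

```python
def hyperslab(*ranges):
    """Build a DODS '[start:stride:stop]' constraint from (start, stride, stop) tuples."""
    return "".join(f"[{start}:{stride}:{stop}]" for start, stride, stop in ranges)

base = "http://ensemblesrt3.dmi.dk:8080/opendap/data/prudence/t2m.mean.DMI.HC1.nc"

# t2m(time, height, latitude, longitude): first time step, first height level,
# all 80 latitudes and all 100 longitudes, stride 1 everywhere.
expr = "t2m" + hyperslab((0, 1, 0), (0, 1, 0), (0, 1, 79), (0, 1, 99))
url = f"{base}?{expr}"
print(url)
```

Increasing the stride (e.g., (0, 2, 99) for every second longitude) thins the data further, which is how you keep downloads small when you only need a coarse view.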
Many software links are provided on the netCDF homepage. For Unix platforms you may especially check out the browser "Ncview" and the manipulation software "NCO", which lets you easily perform arithmetic on netCDF files and change attributes. Ferret is recommended for displaying data with more options than ncview offers.
The program "ncdump", used here to dump the header information of, e.g., the file "clcov.DMI.HC1.nc", is included in the netCDF library package.
As you can see on the header dump page, the header information is divided into three main parts: dimensions, variables, and global attributes. Each variable has a set of attributes. It may seem a bit strange that some dimension names are the same as variable names; the reason is that software reading netCDF data identifies the coordinate axes this way.
The dimensions are listed in C style, slowest-varying first. Fortran programmers should read and define the variables in their programs with the dimensions reversed: for example, t2m(time, rlat, rlon) in C is t2m(rlon, rlat, time) in Fortran.
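This reversal can be sketched in a few lines of Python (toy dimension sizes, offsets computed by hand rather than with any netCDF library): the same element occupies the same flat memory offset under both conventions; only the subscript order is reversed.

```python
NTIME, NRLAT, NRLON = 2, 3, 4   # toy dimension sizes for illustration

def c_offset(t, j, i):
    """Row-major (C) offset of t2m[t][j][i], dims (NTIME, NRLAT, NRLON)."""
    return (t * NRLAT + j) * NRLON + i

def fortran_offset(i, j, t):
    """Column-major (Fortran) offset of t2m(i, j, t), dims (NRLON, NRLAT, NTIME).
    The first subscript varies fastest, giving the same flat offset."""
    return i + NRLON * (j + NRLAT * t)

# t2m[1][2][3] in C is t2m(3, 2, 1) in Fortran: same memory location.
print(c_offset(1, 2, 3) == fortran_offset(3, 2, 1))  # True
```

So a Fortran program that declares t2m(NRLON, NRLAT, NTIME) reads the file's t2m(time, rlat, rlon) data correctly without any transposition.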