
COSMO-Model Tutorial

February 2016

Working with the COSMO-Model

Practical Exercises

for NWP Mode, RCM Mode, COSMO-ART, and Coupling the Community Land Model

For NWP Mode: Ulrich Schättler, Ulrich Blahak, Michael Baldauf, Alexander Smalla, Rodica Dumitrache, Amalia Iriza
For RCM Mode: Susanne Brienen, Kristina Trusilova, Burkhardt Rockel, Merja Tölle, Andreas Will, Andrew Ferrone, Christian Steger
For COSMO-ART: Bernhard Vogel, Jochen Förstner, Carolin Walter, Konrad Deetz, Tobias Schad, Heike Vogel

Contents

1 Installation of the COSMO-Model Package 1


1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.1 The COSMO-Model Package . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.2 External Libraries needed by the COSMO-Model Package . . . . . . . . . . . 2
1.1.3 Computer Platforms for the COSMO-Model Package . . . . . . . . . . . . . . 4
1.1.4 Necessary Data to run the System . . . . . . . . . . . . . . . . . . . . . . . . 4
1.1.5 Practical Information for these Exercises . . . . . . . . . . . . . . . . . . . . . 5
1.2 Installation of the External Libraries . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.2.1 DWD GRIB1 Library . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.2.2 ECMWF GRIB-API . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.2.3 NetCDF . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.3 Installation of the Programs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.3.1 The source code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.3.2 Adaptation of Fopts and the Makefile . . . . . . . . . . . . . . . . . . . . . . 9
1.3.3 Compiling and linking the binary . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.4 Installation of the reference-data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.5 Installation of the starter package for the climate mode . . . . . . . . . . . . . . . . . 11
1.6 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

2 Preparing External, Initial and Boundary Data 13


2.1 Namelist Input for the INT2LM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.2 External Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.3 The Model Domain . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.3.1 The Horizontal Grid . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.3.2 Choosing the Vertical Grid . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.3.3 The Reference Atmosphere . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

2.3.4 Summary of Namelist Group LMGRID . . . . . . . . . . . . . . . . . . . . . . . 16

2.4 Coarse Grid Model Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

2.4.1 Possible Driving Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

2.4.2 Specifying Regular Grids . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

2.4.3 Specifying the GME Grid . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

2.4.4 Specifying the ICON Grid . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

2.5 Specifying Data Characteristics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

2.5.1 External Parameters for the COSMO-Model . . . . . . . . . . . . . . . . . . . 20

2.5.2 External Parameters for the Driving Model . . . . . . . . . . . . . . . . . . . 20

2.5.3 Input Data from the Driving Model . . . . . . . . . . . . . . . . . . . . . . . 20

2.5.4 COSMO Grid Output Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

2.6 Parameters to Control the INT2LM Run . . . . . . . . . . . . . . . . . . . . . . . . . 22

2.6.1 Forecast Time, Domain Decomposition and Driving Model . . . . . . . . . . . 22

2.6.2 Basic Control Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22

2.6.3 Main Switch to choose NWP- or RCM-Mode . . . . . . . . . . . . . . . . . . 23

2.6.4 Additional Switches Interesting for the RCM-Mode . . . . . . . . . . . . . . . 23

2.7 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

2.7.1 NWP-Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

2.7.2 RCM-Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

3 Running the COSMO-Model 27

3.1 Namelist Input for the COSMO-Model . . . . . . . . . . . . . . . . . . . . . . . . . . 27

3.1.1 The Model Domain . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

3.1.2 Basic Control Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

3.1.3 Settings for the Dynamics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29

3.1.4 Settings for the Physics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

3.1.5 Settings for the I/O . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

3.1.6 Settings for the Diagnostics . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

3.2 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

3.2.1 NWP-Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31

3.2.2 RCM-Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32


4 Visualizing COSMO-Model Output using GrADS 33

4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33

4.2 The data descriptor file . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33

4.3 The gribmap program . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35

4.4 A Sample GrADS Session . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35

4.4.1 Drawing a Map Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42

4.4.2 Use of GrADS in Batch mode . . . . . . . . . . . . . . . . . . . . . . . . . . . 44

4.5 Helpful Tools: wgrib and grib2ctl . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45

4.5.1 wgrib . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45

4.5.2 grib2ctl . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46

4.5.3 Exercise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46

5 Postprocessing and visualization of NetCDF files 47

5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

5.2 ncdump . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

5.3 CDO and NCO . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

5.4 ncview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

5.5 Further tools for visualization and manipulation of NetCDF data . . . . . . . . . . . 49

5.6 ETOOL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

5.7 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

6 Running Idealized Test Cases 51

6.1 General remarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51

6.1.1 Visualization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53

6.1.2 gnuplot . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53

6.2 Exercises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54

7 Troubleshooting for the COSMO-Model 57

7.1 Compiling and Linking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57

7.2 Troubleshooting in NWP-Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58

7.2.1 Running INT2LM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58

7.2.2 Running the COSMO-Model . . . . . . . . . . . . . . . . . . . . . . . . . . . 59

7.3 Troubleshooting in RCM-Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60


8 Running COSMO-ART 61

8.1 General remarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61

8.2 Source Code and Installation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61

8.3 Exercises at DWD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68

8.3.1 Dispersion of idealized area emissions . . . . . . . . . . . . . . . . . . . . . . . 68

8.3.2 Dispersion of volcanic ash . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68

8.3.3 Emission and dispersion of natural aerosols . . . . . . . . . . . . . . . . . . . 69

8.3.4 Dispersion of anthropogenic aerosols and gases and formation of secondary aerosol particles . . . . . . . . 69

8.3.5 Full model chain of COSMO-ART and INT2LM-ART . . . . . . . . . . . . . 69

8.4 INT2LM-ART . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70

Appendix A The Computer System at DWD 73

A.1 Available Platforms at DWD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73

A.2 The Batch System for the Cray . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74

A.3 Cray Run-Scripts for the Programs . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74

A.3.1 Scripts for COSMO-model real case simulations and INT2LM . . . . . . . . . 74

A.3.2 Scripts for idealized COSMO-model simulations . . . . . . . . . . . . . . . . . 79

Appendix B The Computer System at DKRZ 83

B.1 Available Platforms at DKRZ . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83

B.2 The Batch System for the mistral . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83

B.3 Script environment for running the COSMO-CLM in a chain . . . . . . . . . . . . . 84

B.3.1 The scripts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84

B.3.2 The environment variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86

B.3.3 Extensions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87

B.3.4 Running the CCLM chain . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88

B.3.5 The directory structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88

B.3.6 Remarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88

Appendix C Necessary Files and Data for the NWP Mode 91

C.1 Availability of Source Codes and Data . . . . . . . . . . . . . . . . . . . . . . . . . . 91

C.2 External Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92

C.3 Driving Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96


Appendix D Necessary Files and Data for the Climate Mode 97

D.1 The Source Codes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98

D.2 External Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98

D.3 Initial and Boundary Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98

D.4 Preprocessor and model system overview . . . . . . . . . . . . . . . . . . . . . . . . . 99


Chapter 1

Installation of the COSMO-Model


Package

1.1 Introduction

1.1.1 The COSMO-Model Package

The COSMO-Model Package is a regional numerical weather prediction system. It is based on the
COSMO-Model, a nonhydrostatic limited-area atmospheric prediction model. Besides the forecast
model itself, a number of additional components such as data assimilation, interpolation of boundary
conditions from the driving model, and pre- and postprocessing utilities are required to run the
COSMO-Model in numerical weather prediction mode (NWP-mode), in climate mode (RCM-mode)
or for idealized case studies. For some time now it has also been possible to run an online-coupled
module for aerosols and reactive trace gases (COSMO-ART) together with the COSMO-Model. The
source code for the COSMO-Model Package consists of the following two programs:

• The interpolation program INT2LM


The INT2LM interpolates data from different data sets to the rotated latitude-longitude grid
of the COSMO-Model. Thus it provides the initial and / or boundary data necessary to run
the COSMO-Model. Data from the global models ICON (ICOsahedral, Non-hydrostatic: new
global model of DWD since January 2015), GME (icosahedral hydrostatic: old global model
of DWD), IFS (spectral model of ECMWF) and the regional COSMO-Model itself can be
processed directly. For the climate mode, processing of a common data format from global
data sets has been implemented. Thus, data from ECHAM5, ERA15, ERA40, NCEP and
some other models can be processed. An extra pre-pre-processor is needed to convert these
data to the common data format.

• The nonhydrostatic limited-area COSMO-Model


The COSMO-Model is a nonhydrostatic limited-area atmospheric prediction model. It has
been designed for both operational numerical weather prediction (NWP) and various scientific
applications on the meso-β and meso-γ scale. Between 2002 and 2006 a climate mode was
developed and implemented, which also allows long-term simulations and regional climate
calculations. The NWP-mode and the climate mode share the same source code; with every
major release, the diverging developments are unified again. This code basis is also used to
couple COSMO-ART to the model. Note, however, that COSMO-ART requires additional source
code, which is provided by the Karlsruhe Institute of Technology (KIT).

The COSMO-Model is based on the primitive thermo-hydrodynamical equations describing
compressible flow in a moist atmosphere. The model equations are formulated in rotated
geographical coordinates and a generalized terrain-following height coordinate. A variety of
physical processes are taken into account by parameterization schemes. Also included in the
model is a continuous data assimilation scheme for the atmospheric variables based on the
nudging method.

The purpose of this tutorial is to give you some practical experience in installing and running the
COSMO-Model Package. Exercises can be carried out on the supercomputers at DWD and / or
DKRZ, respectively. The principal steps of the installation can be transferred directly to other systems.

1.1.2 External Libraries needed by the COSMO-Model Package

For some tasks the COSMO-Model package needs external libraries. All library interfaces
are enclosed in appropriate ifdefs, so their usage can be avoided. Note that if a particular
ifdef is not set, the corresponding tasks cannot be performed by the programs. The tasks, the
associated libraries and the ifdef names are listed below.
Data Input and Output
Two data formats are implemented in the package to read and write data from or to disk: GRIB
and NetCDF.

• GRIB (GRIdded Binary) is a standard defined by the World Meteorological Organization
(WMO) for the exchange of processed data in the form of grid point values expressed in binary
form. GRIB coded data consist of a continuous bit stream made of a sequence of octets (1
octet = 8 bits). Two versions of GRIB are currently in use: GRIB1 has existed since the end of
the 1990s and is still used at most weather services; GRIB2 is the new upcoming standard,
which is used at only a few centers up to now, among them DWD.
The COSMO-Model can work with both GRIB versions, but needs different libraries.

• NetCDF (Network Common Data Form) is a set of software libraries and machine-independent
data formats that support the creation, access, and sharing of array-oriented scientific data.
NetCDF files contain the complete information about the dependent variables, the history
and the fields themselves.
For more information on NetCDF see http://www.unidata.ucar.edu.

To work with the formats described above the following libraries are implemented in the COSMO-
Model package:

• The DWD GRIB1 library libgrib1.a


DWD provides a library to pack and unpack data using the GRIB1 format. This library also
contains the C routines for performing the input and output for the programs. It cannot
work with the new GRIB2 standard.
To activate the interface routines to this library and to link it, the programs have to be
compiled with the preprocessor directive GRIBDWD.

• The ECMWF GRIB_API libraries libgrib_api.a, libgrib_api_f90.a


ECMWF has developed an application programming interface (API) to pack and unpack
GRIB1 as well as GRIB2 data, the grib_api. It has been implemented in the package and can
be used for the GRIB2 format from INT2LM 2.01 and COSMO-Model 5.03 on. Please note
that using GRIB2 is still experimental and not yet supported operationally outside DWD.




Only the reading of GRIB2 files by INT2LM is already possible in operational mode.
To activate grib_api and to link the two corresponding libraries, the programs have to be
compiled with the preprocessor directive GRIBAPI.
• The NetCDF library libnetcdf.a
A special library, the NetCDF-library, is necessary to write and read data using the NetCDF
format. This library also contains routines for manipulating and visualizing the data (nco-
routines). Additionally, CDO-routines written at the Max-Planck-Institute for Meteorology in
Hamburg may be used. These routines provide additional features like the harmonics or the
vorticity of the wind field.
To activate the NetCDF interface routines and to link the corresponding library, the programs
have to be compiled with the preprocessor directive NETCDF.

Observation Processing
In data assimilation mode it is necessary to process observation data, which is also done using
NetCDF files. Therefore, the libnetcdf.a is also necessary for running the nudging with observation
processing. To activate the nudging, the COSMO-Model has to be compiled with the preprocessor
directive NUDGING. If this directive is not specified, the source code for the data assimilation mode
is not compiled, and running the nudging is then not possible.
Synthetic Satellite Images
Since Version 3.7 the COSMO-Model contains an interface to the RTTOV7 library (Radiative Transfer
Model). This interface has been developed at the DLR Institute for Atmospheric Physics in
Oberpfaffenhofen. Together with the RTTOV7 library it is possible to compute synthetic satellite
images (brightness temperatures and radiances) derived from model variables for various instruments
on the satellites Meteosat 5-7 and Meteosat Second Generation.
Later, this interface has been extended to work with newer RTTOV versions, RTTOV9 and RTTOV10.
The RTTOV libraries are developed and maintained by UKMO et al. in the framework of
the ESA NWP-SAF. They have been modified at DWD for use in parallel programs. To use the RTTOV
libraries, a license is necessary. To obtain this license, please contact [email protected].

• RTTOV7: librttov7.a
To activate the interface for RTTOV7, the model has to be compiled with the preprocessor
directive RTTOV7.
• RTTOV9: librttov_ifc.a, librttov9.3.a, librttov9.3_parallel.a
With Version 4.18, an interface to the newer RTTOV9 library was also added. Besides the
model-specific interface there is an additional model-independent interface, mo_rttov_ifc,
which is also available from an external library librttov_ifc.a. For using RTTOV9, two
additional external libraries librttov9.3_parallel.a and librttov9.3.a are also necessary.
To activate the interfaces for RTTOV9, the model has to be compiled with the preprocessor
directive RTTOV9.
• RTTOV10: librttov10.2.a, libradiance10.2.a, libhrit.a
COSMO-Model 5.0 also contains the interface for the RTTOV10 libraries. To activate the in-
terfaces for RTTOV10, the model has to be compiled with the preprocessor directive RTTOV10.
From COSMO-Model 5.03 on, a modified version of libradiance10.2.a plus an additional
library libhrit.a are necessary.

Note that only one of these libraries can be activated. If none of the directives RTTOV7, RTTOV9
or RTTOV10 is specified, the RTTOV libraries are not necessary, but the computation of synthetic
satellite images is then not possible.




1.1.3 Computer Platforms for the COSMO-Model Package

For practical work, the libraries and programs have to be installed on a computer system. You
should have some knowledge about Unix / Linux, Compilers and Makefiles for these exercises.

INT2LM and the COSMO-Model are implemented for distributed memory parallel computers using
the Message Passing Interface (MPI), but can also be installed on sequential computers, where MPI
is not available. For that purpose, a file dummy_mpi.f90 has to be compiled and linked with the
programs. A Makefile is provided with the source codes, where the compiler call, the options and
the necessary libraries can be specified.

All the codes have already been ported to different computer platforms and may easily be adapted
to other systems. If you want to port the libraries and programs, you have to adapt the compiler
and linker options in the Makefiles (see Sections 1.2.1 and 1.3.2) and the machine-dependent parts
in the run-scripts (see Appendices A.3 and / or B.3, respectively).

1.1.4 Necessary Data to run the System

Besides the source codes of the COSMO-package and the libraries, data are needed to perform runs
of the COSMO-Model. There are three categories of necessary data:

External Parameter Files

External parameters are used to describe the surface of the earth. These data include the orography
and the land-sea-mask. Also, several parameters are needed to specify the dominant land use of a
grid box like the soil type or the plant cover.

For fixed domain sizes and resolutions some external parameter files for the COSMO-Model are
available. For the NWP community these data are usually provided in GRIB-format, while the
CLM-Community prefers the NetCDF format.

External parameter data sets can be generated for any domain on the earth. Depending on the raw
data sets available up to now, the highest resolution possible is about 2 km (0.02 degrees). At DWD
there is a tool EXTPAR for creating the external parameters. The CLM-Community has a slightly
different tool PEP (Preparation of External Parameters), which can use additional raw data sets.
More information on this issue can be found in the COSMO-Model documentation.

Boundary Conditions

Because the COSMO-Model is a regional system, it needs boundary data from a driving model. For
the NWP-mode it is possible to use the ICON (or old GME) and the IFS as driving models. Also,
the COSMO-Model data can be used as boundary data for a higher resolution COSMO run. The
boundary data are interpolated to the COSMO-Model grid with the INT2LM.

For the climate mode, several driving models are possible. A pre-pre-processor transforms their
output into a common format, which can be processed by the INT2LM. See additional documentation
from the CLM-Community for this mode.




Initial Conditions

In an operational NWP-mode, initial conditions for the COSMO-Model are produced by Data
Assimilation. Included in the COSMO-Model is the nudging procedure, which nudges the model
state towards available observations. For normal test runs and case studies, the initial data can also
be derived from the driving model using the INT2LM. In general, this is also the way it is done for
climate simulations since data assimilation is not used then.

1.1.5 Practical Information for these Exercises

For practical work during these exercises, you need information about the use of the computer
systems and about where you can access the necessary files and data. Depending on whether you run
the NWP- or the RCM-mode, this information differs. Please take a look at the following
Appendices:

A: The Computer System at DWD


To get information on how to use DWD’s supercomputer and run test jobs using the NWP-
runscripts.

B: The Computer System at DKRZ


To get information on how to use DKRZ’s supercomputer and run test jobs using the RCM-
runscripts.

C: Necessary Files and Data for the NWP-Mode


To get information on where you can access the source code and the data for the NWP-
exercises.

D: Necessary Files and Data for the RCM-Mode


To get information on where you can access the source code and the data for the RCM-
exercises.

1.2 Installation of the External Libraries

1.2.1 DWD GRIB1 Library

Introduction

The GRIB1 library has two major tasks:

• First, it has to pack the meteorological fields into GRIB1 code and vice versa. Because GRIB1
coded data consists of a continuous bit stream, this library has to deal with the manipulation
of bits and bytes, which is not straightforward in Fortran.

• The second task is the reading and writing of these bit streams. The corresponding routines
are written in C, because this usually gives better performance than Fortran I/O.

The bit manipulation and the Fortran-C interface (which is not standardized across different
platforms) make the GRIB1 library a rather critical part of the COSMO-Model Package, because its
implementation is machine dependent.




Source code of the DWD grib library

The source code of the grib library and a Makefile are contained in the compressed tar-file
DWD-libgrib1_<date>.tar.gz (or .bz2). By unpacking (tar xvf DWD-libgrib1_<date>.tar) a
directory DWD-libgrib1_<date> is created that contains a README and subdirectories include
and source. In the include-subdirectory several header files are contained, that are included dur-
ing the compilation to the source files. The source-subdirectory contains the source code and a
Makefile.
For a correct treatment of the Fortran-C-Interface and for properly working with bits and bytes, the
Makefile has to be adapted to the machine used. For the compiler-calls and -flags, several variables
are defined which can be commented / uncommented as needed. For the changes in the source code,
implementations are provided for some machines with ifdefs. If there is no suitable definition /
implementation, this has to be added.

Adaptation of the Makefile

In the Makefile, the following variables and commands have to be adjusted according to your needs:

LIBPATH Directory, to which the library is written (this directory again is needed in the
Makefiles of INT2LM and COSMO-Model).

INCDIR ../include.

AR This is the standard Unix archiving utility. If compiling and archiving on one
computer, but for another platform (like working on a front-end and compiling
code for a NEC SX machine), this also should be adapted.

FTN Call to the Fortran compiler.

FCFLAGS Options for compiling the Fortran routines in fixed form. Here the -Dxxx must
be set.

F90FLAGS Options for compiling the Fortran routines in free form.

CC Call to the C compiler (should be cc almost everywhere).

CCFLAGS Options for compiling the C routines. Here the -Dxxx must be set to choose
one of the ifdef variants.

Big Endian / Little Endian


Traditional Unix workstation processors use a so-called big-endian byte order, i.e. they store the
most significant byte of a word first. Typical PC processors (Intel et al.) do it the other way
round (little-endian). Because of this, the byte order has to be swapped when
using a PC or a Linux machine. This byte swapping is implemented in the grib library and can be
switched on by defining -D__linux__.

Compiling and creating the library

make or also make all compiles the routines and creates the library libgrib1.a in LIBPATH. On
most machines you can also compile the routines in parallel by using the GNU-make with the
command gmake -j np, where np gives the number of processors to use (typically 8).
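As an illustration only, the complete build sequence might look as follows; the archive date, the number of processes and LIBPATH are placeholders that have to match your own setup:

   tar xvf DWD-libgrib1_<date>.tar
   cd DWD-libgrib1_<date>/source
   # adapt LIBPATH, FTN, FCFLAGS, CC, CCFLAGS and the -Dxxx settings in the Makefile first
   make all                   # or, with GNU make: gmake -j 8
   ls <LIBPATH>/libgrib1.a    # the library should now be available in LIBPATH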




Running programs using the DWD GRIB1 library

All programs that are doing GRIB1 I/O, have to be linked with the library libgrib1.a. There are
two additional issues that have to be considered when running programs using this GRIB1 library:

• DWD still uses a GRIB1 file format where all records start and end with additional
bytes, the so-called controlwords. An implementation of the GRIB1 library is prepared that
also deals with pure GRIB1 files that do not have these controlwords. But correct execution
is only guaranteed if these controlwords are used. To ensure this you have to set the
environment variable

export LIBDWD_FORCE_CONTROLWORDS=1.

in all your run-scripts.

• Another environment variable has to be set, if INT2LM is interpolating GME data that are
using ASCII bitmap files:

export LIBDWD_BITMAP_TYPE=ASCII.

(because this library can also deal with binary bitmap files, which is the default).
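Taken together, the relevant part of a run-script might then simply contain the following lines before the program is started (the second line is only needed when GME data with ASCII bitmap files are processed):

   export LIBDWD_FORCE_CONTROLWORDS=1
   export LIBDWD_BITMAP_TYPE=ASCII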

1.2.2 ECMWF GRIB-API

Introduction

Recently ECMWF provided a new library to pack and unpack GRIB data, the grib_api, which
can also be used for GRIB2. The approach of how data are coded to and decoded from GRIB messages
is rather different compared to the DWD GRIB1 library. While the DWD GRIB1 library provides
interfaces to code and decode the full GRIB message in one step, the grib_api uses a so-called
key/value approach, where the individual meta data can be set one by one. In addition to these interface routines
(available for Fortran and for C), there are some command line tools that provide an easy way to
check and manipulate GRIB data from the shell.

For more information on grib_api we refer to the ECMWF web page:

https://software.ecmwf.int/wiki/display/GRIB/Home.

Installation

The source code for the grib_api can be downloaded from the ECMWF web page

https://software.ecmwf.int/wiki/display/GRIB/Releases.

Please refer to the README for installing the grib_api libraries, which is done with a configure-
script. Check the following settings:

• Installation directory: create the directory where to install grib_api:

mkdir /e/uhome/trngxyx/grib_api




• If you do not use standard compilers (e.g. on DWD’s Cray system), you can either set environ-
ment variables or give special options to the configure-script. If you work on a standard linux
machine with the gcc compiler environment, you do not need to specify these environment
variables.

– export CC=/opt/cray/craype/2.4.2/bin/cc
– export CFLAGS='-O2 -hflex_mp=conservative -h fp_trap'
– export LDFLAGS='-K trap=fp'
– export F77='/opt/cray/craype/2.4.2/bin/ftn'
– export FFLAGS='-O2 -hflex_mp=conservative -h fp_trap'
– export FC='/opt/cray/craype/2.4.2/bin/ftn'
– export FCFLAGS='-O2 -hflex_mp=conservative -h fp_trap'

• grib_api can make use of optional jpeg packing of the GRIB records, but this requires the
installation of additional packages. Because INT2LM and the COSMO-Model do not use this
optional packing, the use of jpeg can be disabled during the configure step with the option
--disable-jpeg

• To use statically linked libraries and binaries, you should give the configure option
--enable-shared=no.

./configure --prefix=/your/install/dir --disable-jpeg --enable-shared=no

After the configuration has finished, the grib_api library can be built with make and then make
install.
Note that it is not necessary to set the environment variables mentioned above for the DWD GRIB1
library.
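As a minimal sketch, assuming the installation directory created above and a standard Linux / gcc environment, the whole installation could then look like:

   cd grib_api-<version>      # directory created by unpacking the downloaded release
   ./configure --prefix=/e/uhome/trngxyx/grib_api --disable-jpeg --enable-shared=no
   make
   make install               # installs the libraries and the command line tools under the prefix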

1.2.3 NetCDF

NetCDF is a freely available library for reading and writing scientific data sets. If the library is
not yet installed on your system, you can get the source code and documentation (this includes a
description of how to install the library on different platforms) from
http://www.unidata.ucar.edu/software/netcdf/index.html
Please make sure that the F90 package is also installed, since the model reads and writes data
through the F90 NetCDF functions.
For this training course, the path to access the NetCDF library on the used computer system is
already included in the Fopts and / or Makefile.

1.3 Installation of the Programs

1.3.1 The source code

The files int2lm_yymmdd_x.y.tar.gz and cosmo_yymmdd_x.y.tar.gz (or .bz2) contain the source
codes of the INT2LM and the COSMO-Model, respectively. yymmdd describes the date in the form "Year-
Month-Day" and x.y gives the version number as in the DWD Version Control System (VCS).




Since between major model updates the code is developed in parallel by the DWD and the CLM-
Community, the code names are slightly different for COSMO-CLM where a number z is added:
int2lm_yymmdd_clmz and cosmo_yymmdd_x.y_clmz.
By unpacking (tar xvf int2lm_yymmdd_x.y.tar or tar xvf cosmo_yymmdd_x.y.tar) a directory
int2lm_yymmdd_x.y (or cosmo_yymmdd_x.y) is created which contains a Makefile together with
some associated files and (for the NWP Versions) some example run-scripts to set the Namelist
Parameters for certain configurations and to start the programs.

DOCS Contains a short documentation of the changes in version x.


Fopts Definition of the compiler options and also directories of libraries.
LOCAL Contains several examples of Fopts-files for different computers.
Makefile For compiling and linking the programs.
runxxx2yy Scripts to set the Namelist values for a configuration from model xxx to application yy and start the program.
src Subdirectory for the source code.
obj Subdirectory where the object files are written.
Dependencies Definition of the dependencies between the different source files.
Objfiles Definition of the object files.
work Subdirectory for intermediate files.

Generally, only the Fopts part needs to be adapted by the user for different computer systems.

1.3.2 Adaptation of Fopts and the Makefile

In Fopts, the variables for compiler-calls and -options and the directories for the used libraries have
to be adapted. In the subdirectory LOCAL, some example Fopts-files are given for different hardware
platforms and compilers. If there is nothing suitable, the variables have to be defined as needed.

F90 Call to the Fortran (90) compiler.


LDPAR Call to the linker to create a parallel binary.
LDSEQ Call to the linker to create a sequential binary without using the Message
Passing Interface (MPI).
PROGRAM The binary can be given a special name.
COMFLG1 ... COMFLG4 Options for compiling the Fortran routines of the COSMO-Model.
LDFLG Options for linking.
AR Name of the archiving utility. This usually is ar, but when using the NEC
cross compiler it is sxar.
LIB Directories and names of the used external libraries.
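As an illustration only, an Fopts file for a generic MPI / Linux environment could contain entries like the ones below; the compiler names, options and library paths are assumptions that have to be replaced by the values valid on your system:

   F90      = mpif90 -c -cpp
   LDPAR    = mpif90
   LDSEQ    = gfortran
   PROGRAM  = cosmo_exe
   COMFLG1  = -O2 -DGRIBDWD -DNETCDF -DNUDGING -I<path-to-netcdf>/include
   # COMFLG2 ... COMFLG4 are usually set analogously
   LDFLG    =
   AR       = ar
   LIB      = <path-to-libgrib1>/libgrib1.a -L<path-to-netcdf>/lib -lnetcdff -lnetcdf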




1.3.3 Compiling and linking the binary

With the Unix command make exe (or just make), the programs are compiled and all object files
are linked to create the binaries. On most machines you can also compile the routines in parallel by
using the GNU-make with the command gmake -j np, where np gives the number of processors to
use (typically 8).

1.4 Installation of the reference-data

The file reference_data_5.03.tar(.bz2) contains an ICON reference data set together with the
ASCII output, created on the Cray XC30 of DWD. Included are ICON output data from the 7th of
July 2015, 12 UTC, for up to 12 hours. These data are only provided for a special region over central
Europe. By unpacking (tar xvf reference_data_5.03.tar) the following directory structure is
created:

COSMO_2_input Binary input data for the COSMO-Model (2.8 km resolution; only
selected data sets available for comparison with your results)

COSMO_2_output Binary output data from the COSMO-Model (2.8 km resolution;


only selected data sets available for comparison with your results)

cosmo_2_output_ascii ASCII reference output from the COSMO-Model (2.8 km resolu-


tion) together with a run-script run_cosmo_2.

COSMO_7_input Binary input data for the COSMO-Model (7 km resolution; only


selected data sets available for comparison with your results)

COSMO_7_output Binary output data from the COSMO-Model (7 km resolution; only


selected data sets available for comparison with your results)

cosmo_7_output_ascii ASCII reference output from the COSMO-Model (7 km resolution)


together with a run-script run_cosmo_7.

ICON_2015070712_sub Binary ICON GRIB2 input data for the INT2LM and necessary
ICON external parameter and grib files (partly in NetCDF format).

int2lm_2_output_ascii ASCII reference output from the INT2LM run that transforms the
COSMO-Model 7 km data to the 2.8 km grid together with a run-
script run_cosmo7_2_cosmo2.

int2lm_7_output_ascii ASCII reference output from the INT2LM run that transforms the
ICON data to the 7 km COSMO-Model grid together with a
run-script run_icon_2_cosmo7.

README More information on the reference data set.

The characteristics of this data set are: start at 7 July 2015, 12 UTC, + 12 hours.

Application    Resolution (km / degrees)    Grid size       startlat_tot    startlon_tot
COSMO-7        7 / 0.0625                   129×161×40      -5.0            -4.0
COSMO-2        2.8 / 0.025                  121×141×50      -3.5            -3.5




1.5 Installation of the starter package for the climate mode

For the climate simulation, all the necessary code and input files to start working with the COSMO-
CLM are included in the starter package which is available on the CLM-Community web page. After
unpacking the archive (tar -xzvf cclm-sp.tgz) you find the source code, initial and boundary
data, external parameters and run scripts as described in Appendix B and D. Note that the starter
package requires the netCDF4 library linked with HDF5 and zip libraries and extended by the
Fortran netCDF package. The netCDF4 package comes with the programs ncdump and nccopy. If
you are using the starter package on another computer than mistral make sure that these programs
are installed correctly1.

The compilation of the libraries and binaries is very similar to that for the NWP mode (Sections
1.2–1.3). In this course, we need to compile the GRIB library and the INT2LM and COSMO-CLM
binaries. The NetCDF library is already available on the mistral computer. An additional package
of some small auxiliary Fortran programs is needed, which is called CFU.

In all cases you have to type 'make' in the respective subdirectory to compile the codes. For INT2LM
and COSMO-CLM you have to adapt the Fopts file and set your working directory (SPDIR) first. An
initialization script called init.sh helps with these adaptations. It is therefore useful to run this script
once directly after unpacking the whole package. On computer systems other than the mistral at
DKRZ you have to be careful to make the necessary additional changes in Fopts and the Makefile.
Some of these options are already available in the files.
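A minimal sketch of these steps could look as follows; the directory names are placeholders that depend on the version of the starter package:

   tar -xzvf cclm-sp.tgz
   cd <starter-package-directory>
   ./init.sh                      # sets SPDIR and adapts the Fopts files
   cd <int2lm-source-directory>   # repeat for the COSMO-CLM and CFU subdirectories
   make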

More information on the setup of the model can be found on the CLM-Community webpage. In
Model → Support, you can find the starter package, the WebPEP tool for generating external data
files, a namelist tool with detailed information on the namelist parameters of the model and links
to the bug reporting system and technical pages (RedC) and some other utilities.

1.6 Exercises

NWP-Mode:

EXERCISE:
Install the DWD GRIB1-library, the ECMWF grib_api, the INT2LM and the COSMO-
Model on your computer and make a test run using the reference data set. For that you
have to do:

• Copy the source files to your home directory and the reference data set to your
work- or scratch-directory (see Appendix C.1).

• Install the libraries and programs according to Sections 1.2.1, 1.2.2, 1.3 and 1.4.

• Run the INT2LM and the COSMO-Model for (at least) the 7 km resolution. Copy
the corresponding run-scripts to the directories where you installed INT2LM and
the COSMO-Model, respectively. You (hopefully) only have to change some directory names
and the machine you want to run on. See Appendix A.3 for an explanation of
the run-scripts.

1
software available from http://www.unidata.ucar.edu/software/netcdf/index.html




RCM-Mode:

EXERCISE:
Install the COSMO-CLM starter package on your computer and compile the necessary
programs. For that you have to do:

• Copy the starter package to your scratch-directory (see Appendix D).

• Install the library and programs according to Sections 1.2.1, 1.3 and 1.5.




Chapter 2

Preparing External, Initial and


Boundary Data

In this lesson you should prepare the initial and boundary data for a COSMO-Model domain of
your choice. You will learn how to specify some important Namelist variables for the INT2LM.

2.1 Namelist Input for the INT2LM

The execution of the INT2LM is controlled by the following NAMELIST groups:

CONTRL parameters for the model run


GRID_IN specifying the domain and the size of the driving model grid
LMGRID specifying the domain and the size of the LM grid
DATA controlling the grib input and output
PRICTR controlling grid point output

All NAMELIST groups have to appear in the input file INPUT in the order given above. Default
values are set for all parameters; you only have to specify values that differ from the
defaults. The NAMELIST variables can be specified by the user in the run-script for the INT2LM,
which then creates the INPUT file. See Appendices A.3 (NWP) and / or B.3 (RCM) for details on
the run-scripts.
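To give an idea of how this looks in practice, a run-script typically writes the INPUT file with a here-document as in the following sketch; the variables ydate_ini, hstart and hstop and their values are shown only for illustration (they correspond to the reference data set of Sect. 1.4) and have to be replaced by your own settings:

   cat > INPUT << EOF
    &CONTRL
     ydate_ini='2015070712', hstart=0.0, hstop=12.0,
    /
    &GRID_IN
     ...
    /
    &LMGRID
     ...
    /
    &DATA
     ...
    /
    &PRICTR
     ...
    /
   EOF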

2.2 External Data

Before you can start a test run, you have to specify a domain for the COSMO-Model and get some
external parameters for this domain. How the domain is specified is explained in detail in Sect. 2.3.
For interpolation reasons the external parameter data set must contain at least one additional row of
grid points on all four sides of the model domain. The domain for the external parameters can be
larger than the domain for which the COSMO-Model will be run; the INT2LM then cuts out the
proper domain.
For the first exercises, you have to take one of the external data sets specified in Appendix C.2
(for NWP) or Appendix D.2 (for RCM). In the future, if you want to work with different domains,
you can contact DWD or the CLM-Community, who can provide you with an external parameter set for your
domain.




2.3 The Model Domain

This section explains the variables of the Namelist group LMGRID and gives some theoretical back-
ground.
To specify the domain of your choice, you have to choose the following parameters:

1. The lower left grid point of the domain in rotated coordinates.

2. The horizontal resolution in longitudinal and latitudinal direction in degrees.

3. The size of the domain in grid points.

4. The geographical coordinates of the rotated north pole, to specify the rotation
(or the angles of rotation for the domain).

5. The vertical resolution by specifying the vertical coordinate parameters.

All these parameters are determined by Namelist variables of the group LMGRID. In the
following, the variables and their meaning are explained.
Note: For describing the (possibly larger) domain of the external parameters, you only have to specify
values for 1) - 4) above. A vertical resolution cannot be given for the external parameters, because they
are purely horizontal fields.

2.3.1 The Horizontal Grid

The COSMO-Model uses a rotated spherical coordinate system, where the poles are moved and can
be positioned such that the equator runs through the centre of the model domain. Thus, problems
resulting from the convergence of the meridians can be minimized for any limited area model domain
on the globe.
Within the area of the chosen external parameter data set, you can specify any smaller area for your
tests. Please note that, for interpolation reasons, your domain has to be at least one grid point smaller
on each side than the domain of the external parameters.
The horizontal grid is specified by setting the following Namelist variables. Here we show an example
for a domain in central Europe.

startlat_tot = -17.0, startlon_tot = -12.5,   lower left grid point in rotated coordinates
dlon=0.0625, dlat=0.0625,                     horizontal resolution of the model
ielm_tot=225, jelm_tot=269,                   horizontal size of the model domain in grid points
pollat = 40.0, pollon = -170.0,               geographical coordinates of the rotated north pole
                                              (or: angles of rotation for the domain)

2.3.2 Choosing the Vertical Grid

To specify the vertical grid, the number of the vertical levels must be given together with their
position in the atmosphere. This is done by the Namelist variable kelm_tot and by defining the
list of vertical coordinate parameters σ1 , . . . , σke (stored in the Namelist variable vcoord_d(:)).




This is the most difficult part of setting up the model domain, because a certain knowledge about
the vertical grid is necessary. The COSMO-Model equations are formulated using a terrain-following
coordinate system with a generalized vertical coordinate. Please refer to the Scientific Documen-
tation (Part I: Dynamics and Numerics) for a full explanation of the vertical coordinate system.
For practical applications, three different options are offered by INT2LM to specify the vertical
coordinate. The options are chosen by the Namelist variable ivctype:

1. A pressure based coordinate (deprecated: should not be used any more)


The σ values are running from 0 (top of the atmosphere) to 1 (surface),
e.g. σ1 = 0.02, . . . , σke = 1.0.

2. A height based coordinate (standard):


The σ values are given in meters above sea level,
e.g. σ1 = 23580.4414, . . . , σke = 0.0.

3. A height based SLEVE (Smooth LEvel VErtical) coordinate:


The σ values are specified as in 2. In addition some more parameters are necessary. Please
refer to the documentation for a full specification of the SLEVE coordinate.

All these coordinates are hybrid coordinates, i.e. they are terrain following in the lower part of the
atmosphere and change back to a pure z-coordinate (flat horizontal lines with a fixed height above
mean sea level) at a certain height. This height is specified by the Namelist variable vcflat. vcflat
has to be specified according to the chosen vertical coordinate parameters (pressure based or height
based).
Most operational applications in extra-tropical areas now use 40 levels together with coarser hor-
izontal resolutions (up to 7 km) and 50 levels for very high horizontal resolution runs (about 2-3
km). A special setup with a higher model top is available for tropical areas, to better capture
high-reaching convection.
Summary of the Namelist variables regarding the vertical grid:

ivctype=2, to choose the type of the vertical coordinate


vcoord_d=σ1 , . . . , σke , list of vertical coordinate parameters to specify the
vertical grid structure
vcflat=11430.0, height where the levels change back to the z-system
kelm_tot=40, vertical size of the model domain in grid points

Because the specification of the vertical coordinate parameters σk is not straightforward, INT2LM
offers pre-defined sets of coordinate parameters for ivctype=2 and kelm_tot=40 or kelm_tot=50.
Then the list of vertical coordinate parameters vcoord_d= σ1 , . . . , σke and vcflat are set by the
INT2LM.
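If one of these pre-defined sets is used, the vertical grid part of the group LMGRID can therefore be reduced to, e.g.,

   ivctype=2,
   kelm_tot=40,

and vcoord_d as well as vcflat are then filled by the INT2LM.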

2.3.3 The Reference Atmosphere

In the COSMO-Model, pressure p is defined as the sum of a base state (p0 ) and deviations from this
base state (p′). The base state (or reference state) p0 is prescribed to be horizontally homogeneous,
i.e. depending only on height; it is time invariant and hydrostatically balanced.
To define the base state, some other default values like the pressure at sea level (pSL ) and the
temperature at sea level (TSL ) are needed. With these specifications, the reference atmosphere for the COSMO-
Model is computed. For a full description of the reference atmosphere, please refer to the Scientific
Documentation (Part I: Dynamics and Numerics).




The COSMO-Model offers two different types of reference atmosphere.

1: For the first reference atmosphere a constant rate of increase in temperature with the loga-
rithm of pressure is assumed. Therefore, the reference atmosphere has a finite height and the
top of the model domain has to be positioned below this maximum value in order to avoid
unrealistically low reference temperatures in the vicinity of the upper boundary.
2: The second reference atmosphere is based on a temperature profile which allows a higher
model top:

      t0(z) = (t0sl − ∆t) + ∆t · exp(−z / hscal),

   where z = hhl(k) is the height of a model grid point. If this option is used, the values for
   ∆t = delta_t and hscal = h_scal also have to be set.
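As a small numerical illustration, with the values used in the namelist example of Sect. 2.3.4 (delta_t = 75.0, h_scal = 10000.0) and assuming a sea-level reference temperature of t0sl = 288.15 K (this value is only an assumption for the example), the reference temperature at z = 10000 m would be t0 = (288.15 − 75) + 75 · exp(−1) ≈ 213.15 + 27.6 ≈ 240.7 K.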

Which type of reference atmosphere is used is chosen by the Namelist variable irefatm = 1/2.
Some additional characteristics of the chosen atmosphere can also be set. For further details see
the User Guide for the INT2LM (Part V: Preprocessing, Sect. 7.3: Specifying the Domain and the
Model Grid).
The values of the reference atmosphere are calculated analytically on the model half levels. These are
the interfaces of the vertical model layers, for which most atmospheric variables are given. Because
the values are also needed on the full levels (i.e. the model layers), they also have to be computed
there. For the first reference atmosphere (irefatm=1) there are 2 different ways how to compute
the values on the full levels. This is controlled by the namelist switch lanalyt_calc_t0p0:

- lanalyt_calc_t0p0=.FALSE.
The values on the full levels are computed by averaging the values from the half levels. This
has to be set for the Leapfrog dynamical core (l2tls=.FALSE. in the COSMO-Model) or
for the old Runge-Kutta fast waves solver (l2tls=.TRUE. and itype_fast_waves=1 in the
COSMO-Model).
- lanalyt_calc_t0p0=.TRUE.
The values on the full levels are also calculated analytically. This has to be
set for the new Runge-Kutta fast waves solver (l2tls=.TRUE. and itype_fast_waves=2 in
the COSMO-Model).

For irefatm=2 the values on the full levels are always calculated analytically.

2.3.4 Summary of Namelist Group LMGRID

Example of the Namelist variables in the group LMGRID, that are necessary for these exercises:
&LMGRID
startlat_tot = -17.0, startlon_tot = -12.5,
dlon=0.0625, dlat=0.0625,
ielm_tot=225, jelm_tot=269, kelm_tot=40,
pollat = 40.0, pollon = -170.0,
ivctype=2,
irefatm=2,
delta_t=75.0,
h_scal=10000.0,
/




Example for 50 vertical levels in extra-tropical regions:

vcoord_d= 22000.00, 21000.00, 20028.57, 19085.36, 18170.00, 17282.14, 16421.43,


15587.50, 14780.00, 13998.57, 13242.86, 12512.50, 11807.14, 11126.43,
10470.00, 9837.50, 9228.57, 8642.86, 8080.00, 7539.64, 7021.43,
6525.00, 6050.00, 5596.07, 5162.86, 4750.00, 4357.14, 3983.93,
3630.00, 3295.00, 2978.57, 2680.36, 2400.00, 2137.14, 1891.43,
1662.50, 1450.00, 1253.57, 1072.86, 907.50, 757.14, 621.43,
500.00, 392.50, 298.57, 217.86, 150.00, 94.64, 51.43,
20.00, 0.00,

Example for 50 vertical levels in tropical regions:

vcoord_d= 30000.00, 28574.09, 27198.21, 25870.74, 24590.12, 23354.87, 22163.61,


21014.99, 19907.74, 18840.66, 17812.60, 16822.44, 15869.14, 14951.68,
14069.12, 13220.53, 12405.03, 11621.78, 10869.96, 10148.82, 9457.59,
8795.59, 8162.12, 7556.52, 6978.19, 6426.50, 5900.89, 5400.80,
4925.71, 4475.11, 4048.50, 3645.43, 3265.45, 2908.13, 2573.08,
2259.90, 1968.23, 1697.72, 1448.06, 1218.94, 1010.07, 821.21,
652.12, 502.61, 372.52, 261.72, 170.16, 100.00, 50.00,
20.00, 0.00,

2.4 Coarse Grid Model Data

This section explains most variables of the Namelist group GRID_IN.

2.4.1 Possible Driving Models

With the INT2LM it is possible to interpolate data from several driving models to the COSMO
grid. Up to now the following input models are supported in NWP mode:

• ICON: New non-hydrostatic global model from DWD with an icosahedral model grid

• GME: Old hydrostatic global model from DWD with an icosahedral model grid

• IFS: Global spectral model from ECMWF; data is delivered on a Gaussian grid

• COSMO: Data from a coarser COSMO-Model run can be used to drive higher resolution runs

In the RCM mode, additional driving models are possible:

• NCEP: Data from the National Centers for Environmental Prediction (USA)

• ERA40 or ERA-Interim: Data from the ECMWF Reanalysis




• ECHAM5/6: Data from the global model of MPI Hamburg


• HADAM: Data from the Hadley Centre (UK)
• REMO: Data from the regional climate model REMO, MPI Hamburg.
• several CMIP5 GCMs: CGCM3, HadGEM, CNRM, EC-EARTH, Miroc5, CanESM2

The data of these models are pre-pre-processed to a common format, which can
be processed by the INT2LM.

2.4.2 Specifying Regular Grids

The Namelist group GRID_IN specifies the characteristics of the grid of the driving model chosen.
Most input models use a rectangular grid which can be specified by the following Namelist variables:

startlat_in_tot latitude of lower left grid point of total input domain


startlon_in_tot longitude of lower left grid point of total input domain
dlon_in longitudinal resolution of the input grid
dlat_in latitudinal resolution of the input grid
ie_in_tot longitudinal grid point size
je_in_tot latitudinal grid point size
ke_in_tot vertical grid point size
pollat_in geographical latitude of north pole (if applicable; or angle of rotation)
pollon_in geographical longitude of north pole (if applicable; or angle of rotation)

2.4.3 Specifying the GME Grid

The GME has a special horizontal grid derived from the icosahedron. The characteristics of the
GME grid are specified with the variables:

ni_gme= ni stands for number of intersections of a great circle between two points of
the icosahedron and determines the horizontal resolution.
i3e_gme= vertical levels of GME
In recent years the resolution of GME has been increased several times. Depending on the date
you are processing, the above Namelist variables have to be specified according to the following
table:

Date ni_gme i3e_gme


before 02.02.2010, 12 UTC 192 40
02.02.2010, 12 UTC - 29.02.2012, 00 UTC 256 60
29.02.2012, 12 UTC - 20.01.2014, 00 UTC 384 60

With the GME data (from all resolutions), you can do an interpolation to a COSMO grid resolution
of about 7 km, but NOT finer. If you want to run a higher resolution, you have to choose COSMO-EU
data, or you first have to run a 7 km domain yourself and then go on to a finer resolution (which
would be a nice exercise).

Working with GME bitmap data

DWD offers the possibility of providing not the full GME data sets (which are very large) but only
the subset that is needed to run the COSMO-Model for a specific domain. Bitmaps are used to
specify the GME grid points for which data are needed. If GME data sets that were created with
a bitmap are used, the same bitmap has to be provided to INT2LM in order to put all data onto
the correct grid points.

The way bitmaps are used and their format differ between the DWD GRIB1 library and the
ECMWF grib_api. With the DWD GRIB1 library, external ASCII bitmaps are used. To provide the
same bitmap also to INT2LM, the two NAMELIST parameters ybitmap_cat and ybitmap_lfn
in the group DATA (see Sect. 2.5.3) have to be specified. Moreover, the environment variable
LIBDWD_BITMAP_TYPE=ASCII has to be set. The ECMWF grib_api works with internal bitmaps;
nothing special has to be set in that case.
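
As a minimal sketch (using the example directory and file names given in Sect. 2.5.3), the corresponding settings for a bitmap-based GME run with the DWD GRIB1 library look like:

export LIBDWD_BITMAP_TYPE=ASCII      (set in the run script before INT2LM is started)

and, in the Namelist group DATA:

ybitmap_cat='/tmp/data/btmp/',       directory of the ASCII bitmap file
ybitmap_lfn='bitmp384',              name of the ASCII bitmap file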

2.4.4 Specifying the ICON Grid

The ICON horizontal grid is rather similar to the GME grid, but it is implemented as an unstruc-
tured grid. This technical issue, but also the algorithms used to construct the grid, make the grid
generation a very expensive process, which cannot be done during INT2LM runs. In order to process
ICON data, it is necessary to load precalculated horizontal grid information from an external file, a
so-called grid file. A grid file is provided either for the whole globe or for a special COSMO-Model
domain.

The parameters that have to be specified for running with ICON data are:

yicon_grid_cat directory of the file describing the horizontal ICON grid.


yicon_grid_lfn name of the file describing the horizontal ICON grid.
ke_in_tot ke (vertical dimension) for input grid.
nlevskip number of levels that should be skipped for the input model.

At the moment DWD runs a resolution of about 13 km for the global ICON grid. If the global data
are used, the parameter yicon_grid_lfn has to be specified to

yicon_grid_lfn = icon_grid_0026_R03B07_G.nc,

where the "G" marks the global grid.

For our NWP partners, who run the COSMO-Model for daily operational productions, we provide
special grid files for the corresponding regional domain. The file names of these grid files contain
the name of the region and look like:

yicon_grid_lfn = icon_grid_<name-of-region>_R03B07.nc.

Note that the necessary external parameters for the ICON domain and also an additional HHL-file
(see Sect. 2.5) also have to be specified either for the global domain or for a regional domain, and
they have to fit to the ICON grid file.
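
For illustration, the ICON-specific part of GRID_IN could be sketched as follows; the grid-file directory is a placeholder, and the number of vertical levels is only an example value that must match your ICON data set:

yicon_grid_cat='/tmp/data/icon_grid/',         directory of the ICON grid file
yicon_grid_lfn='icon_grid_0026_R03B07_G.nc',   name of the ICON grid file (global grid)
ke_in_tot=90,                                  vertical dimension of the ICON input grid (example)
nlevskip=0,                                    number of uppermost levels to skip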


2.5 Specifying Data Characteristics

This section explains some variables of the Namelist group DATA. In this Namelist group the direc-
tories and some additional specifications of the data have to be specified.

For every data set you can specify the format and the library to use for I/O by setting the appropriate
Namelist variables (see below) to:

’grb1’ to read or write GRIB1 data with DWD GRIB1 library.


’apix’ to read GRIB1 or GRIB2 data with ECMWF grib_api.
’api1’ to write GRIB1 data with ECMWF grib_api.
’api2’ to write GRIB2 data with ECMWF grib_api.
’ncdf’ to read or write NetCDF data with the NetCDF library.

Note that the Climate Community prefers NetCDF, while the NWP Community uses the GRIB
format.

2.5.1 External Parameters for the COSMO-Model

The following variables have to be specified for the dataset of the external parameters for the
COSMO-Model domain:

ie_ext=965, je_ext=773 size of external parameters


ylmext_lfn=’cosmo_d5_07000_965x773.g1_2013111400’ name of external parameter file
ylmext_cat=’/tmp/data/ext1/’ directory of this file
ylmext_form_read=’grb1’ to specify the data format
(or: ’ncdf’, ’apix’)

2.5.2 External Parameters for the Driving Model

The INT2LM also needs the external parameters for the coarse grid driving model. The following
variables have to be specified for this dataset:

yinext_lfn=’invar.i384a’, name of external parameter file for GME


yinext_lfn=’icon_extpar_0026_R03B07_G_20141202.nc’,
name of external parameter file for global ICON domain
yinext_lfn=’icon_extpar_<name-of-region>_R03B07_20141202.nc’,
name of external parameter file for a region
yinext_cat=’/tmp/data/ext2/’, directory of this file
yinext_form_read=’grb1’, to specify the data format ( or ’apix’ or ’ncdf’)

2.5.3 Input Data from the Driving Model

The following variables have to be specified for the input data sets of the driving model:


yin_cat=’/tmp/data/input/’, directory of input data


yin_hhl=’/tmp/data/input/’, name of the vertical grid HHL-file (necessary only
for COSMO and ICON GRIB2)
ybitmap_cat=’/tmp/data/btmp/’, directory of the bitmap file (for GME only)
ybitmap_lfn=’bitmp384’ name of the bitmap file (for GME only)
yin_form_read=’grb1’, to specify the data format (or ’apix’ or ’ncdf’)
ytunit_in=’f’, to specify the time unit and hence the file name of
the input data
yinput_type=’forecast’, to specify the type of the input data (e.g. forecast or analysis)

Some remarks:

• External parameters for the driving model:


The file names of the ICON external parameter files contain a date. This date specifies the
generation date of the external parameters. This ensures the usage of the correct ICON files.

• HHL-files:
ICON and the COSMO-Model both use the generalized vertical coordinate. This means that
the construction of the three-dimensional grid is a more complicated process, which cannot
be done easily. Therefore the full 3D-Grid is stored in a so-called HHL-file. In GRIB1, the
necessary meta data could be stored in GRIB records to reconstruct the three-dimensional
vertical grid. This is not the case any more in GRIB2. Therefore, if working with GRIB2, the
HHL-file has to be read by INT2LM for the COSMO-Model grid and also for the grid of the
driving model. HHL-files are always written in GRIB2.

2.5.4 COSMO Grid Output Data

The following variables have to be specified for the output data sets of the COSMO-Model:

ylm_cat= ’/tmp/data/out/’, directory of the results


ylm_form_write=’grb1’, data format for output data (or ’api1’, ’ncdf’)
ytunit_out=’f’, to specify the time unit and hence the file name of
the output data

There are different variants to specify the file names for the input and output data sets. Again, the
NWP- and CLM-Communities use different settings here, which are specified by the variables
ytunit_in and ytunit_out, respectively. While for NWP applications mainly the default ’f’ is used, the
CLM-Community takes the setting ’d’. For a more detailed explanation of these settings see the
User Guide for the INT2LM (Part V: Preprocessing, Sect. 6.5: Conventions for File Names).
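
Putting the pieces of Sects. 2.5.1 to 2.5.4 together, a DATA group for a GRIB1, GME-driven NWP run could be sketched as follows; all paths and file names are taken from the examples above and have to be adapted to your own setup:

&DATA
  ie_ext=965, je_ext=773,
  ylmext_lfn='cosmo_d5_07000_965x773.g1_2013111400', ylmext_cat='/tmp/data/ext1/',
  ylmext_form_read='grb1',
  yinext_lfn='invar.i384a', yinext_cat='/tmp/data/ext2/', yinext_form_read='grb1',
  yin_cat='/tmp/data/input/', yin_form_read='grb1', ytunit_in='f', yinput_type='forecast',
  ylm_cat='/tmp/data/out/', ylm_form_write='grb1', ytunit_out='f',
/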


2.6 Parameters to Control the INT2LM Run

This section explains only a few basic variables of the Namelist group CONTRL. Most of the parameters
in this group are used to specify a certain behaviour of the INT2LM, depending on special demands
for the COSMO-Model. Please refer to the User Guide for the INT2LM (Part V: Preprocessing,
Sect. 7.1) for a full documentation of all variables.

2.6.1 Forecast Time, Domain Decomposition and Driving Model

The following parameters have to be specified in any case and need no further explanation.

To specify the date and the length of the forecast

ydate_ini date when the forecast begins


hstart start hour relative to the start of the forecast
hstop stop hour relative to the start of the forecast
hincbound increment for boundary update given in hours

To specify which data should be computed

linitial compute initial data


lboundaries compute boundary data

To specify the driving model

yinput_model the name of the driving model, specified as a character string. The following
names are possible: ’ICON’, ’GME’, ’IFS’, ’COSMO’, ’CM’.

To specify the number of processors

nprocx number of processors in X-direction (longitudinal)
nprocy number of processors in Y-direction (latitudinal)
nprocio number of processors for asynchronous IO (usually 0)
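
For illustration, these control parameters could be combined as sketched below; the date, times and processor numbers are placeholders and have to be adapted to your case:

&CONTRL
  ydate_ini='2016021500', hstart=0.0, hstop=48.0, hincbound=3.0,
  linitial=.TRUE., lboundaries=.TRUE.,
  yinput_model='ICON',
  nprocx=4, nprocy=4, nprocio=0,
/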

2.6.2 Basic Control Parameters

Depending on the COSMO-Model configuration, additional initial and / or boundary data has to
be provided. To control this, the following Namelist parameters can be used. Note that additional
external parameters and / or additional output fields from the driving model might be necessary
for some of these switches.


lforest provide additional external parameters for the distribution of evergreen and deciduous forest
lsso provide additional external parameters to run the subgrid scale orography scheme
lprog_qi interpolate cloud ice from the driving model
lprog_qr_qs interpolate rain and snow from the driving model
lmulti_layer_in use multi-layer input soil levels
lmulti_layer_lm compute multi-layer output soil levels
llake provide additional external parameters for the FLake model
lprog_rho_snow interpolate prognostic snow density from the driving model
lfilter_oro to apply a filter for the orography
lfilter_pp to apply a filter for the pressure deviation (after vertical interpolation)
lbalance_pp to compute a hydrostatic balanced pressure deviation (after vertical interpolation)

2.6.3 Main Switch to choose NWP- or RCM-Mode

The main switch to choose between an NWP and an RCM application is, in both models (INT2LM and
the COSMO-Model), the Namelist variable lbdclim. For the NWP mode, this variable is .FALSE.
If it is set to .TRUE., additional boundary data will be provided for the COSMO-Model with every
output data set. These are the slowly varying variables like sea surface temperature, plant cover, leaf
area index, etc. In a typical NWP application, which runs only for a few days, these variables
are held constant. This, of course, cannot be done in a climate application.

lbdclim to run in climate mode

2.6.4 Additional Switches Interesting for the RCM-Mode

itype_w_so_rel to select the type of relative soil moisture input


itype_t_cl to select the source for the climatological temperature
itype_rootdp to select the treatment of the external parameter for the root depth
itype_ndvi to select the treatment of plant cover and leaf area index
itype_calendar to specify a gregorian calendar or a calendar where every year has 360
days
luse_t_skin to use skin temperature for surface

Again note that additional external parameters might be necessary for some settings of these
namelist variables.

Table 2.1 gives some example settings for the Namelist group /CONTRL/ that are used for COSMO-EU,
COSMO-DE and the RCM-Mode, respectively. Again, for a full documentation of these parameters
please refer to the User Guide for the INT2LM (Part V: Preprocessing, Sect. 7.1).


Table 2.1: Example of Namelist Variables for /CONTRL/

&CONTRL
Variable for icon2eu for eu2de for RCM-Mode
lreorder .FALSE. .FALSE. .FALSE.
yinput_model ’ICON’ ’COSMO’ ’CM’ new: ≥ 1.14
llake .FALSE. .FALSE. .FALSE.
lforest .TRUE. .TRUE. .TRUE.
lsso .TRUE. .FALSE. .TRUE.
lbdclim .FALSE. .FALSE. .TRUE.
lprog_qi .TRUE. .TRUE. .TRUE.
lprog_qr_qs .TRUE. .TRUE. .FALSE.
lmulti_layer_in .TRUE. .TRUE. .TRUE.
lmulti_layer_lm .TRUE. .TRUE. .TRUE.
lprog_rho_snow .TRUE. .TRUE. .FALSE.
itype_w_so_rel 1 1 0
itype_t_cl 0 0 1
itype_rootdp 0 0 3
itype_ndvi 0 0 0
itype_calendar 0 0 0
lfilter_oro .TRUE. .TRUE. .TRUE.
lfilter_pp .FALSE. .TRUE. .FALSE.
lbalance_pp .FALSE. .TRUE. .FALSE.
eps_filter 0.1 0.1 N.A.
norder_filter 1 5 N.A.
ilow_pass_oro 1 4 N.A.
ilow_pass_xso 0 5 N.A.
rxso_mask 0.0 625.0 N.A.
luse_t_skin .FALSE. .FALSE. .TRUE.


2.7 Exercises

To do the following exercises, you have to adapt some namelist input for the INT2LM. If necessary,
take a look at the previous sections, where the most important namelist parameters are given for
typical NWP and RCM applications.

2.7.1 NWP-Mode

EXERCISE:
Choose a domain and prepare the initial and boundary conditions for up to 48 hours for
a 7 km resolution COSMO-Model grid. For reasons of memory and computer time, the
domain size should not be bigger than about 400 × 400 × 50 grid points.

• Choose an appropriate runscript from the INT2LM and adapt it to your needs.

• Choose an appropriate external parameter data set.

• Choose data from a driving model.

For available data sets see Appendix C.2 for external parameters and C.3 for driving
model data.
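
As a rough orientation for choosing the domain size (the numbers are only an example, not a prescription): at 7 km resolution, i.e. dlon = dlat = 0.0625 degrees, a grid of 400 × 400 points spans about 399 × 0.0625 ≈ 25 degrees in each direction, i.e. roughly 2800 km, so e.g. ie_tot = je_tot = 400 together with ke_tot = 50 just stays within the recommended limit.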

2.7.2 RCM-Mode

EXERCISE:
Prepare the initial and boundary conditions for one month for a 0.44◦ resolution COSMO-
CLM grid in the step-by-step mode using the NCEP input.

• Go to the directory $SPDIR/step_by_step/gcm_to_cclm and look at the run script


for INT2LM (run_int2lm).

• Adapt the variable SPDIR (if not done yet with the init.sh script).

• Submit the job and look at the results (see Appendix B for how to submit jobs on
mistral).

For available data sets see Appendix D.2 for external parameters and D.3 for driving
model data.


Chapter 3

Running the COSMO-Model

In this lesson you will learn how to run the COSMO-Model, using the initial and boundary data
you produced in the last lesson.

3.1 Namelist Input for the COSMO-Model

The execution of the COSMO-Model can be controlled by several NAMELIST groups:

LMGRID specifying the domain and the size of the grid


RUNCTL parameters for the model run
TUNING parameters for tuning physics and dynamics
DYNCTL parameters for the adiabatic model
PHYCTL parameters for the diabatic model
DIACTL parameters for the diagnostic calculations
NUDGING controlling the data assimilation
INICTL parameters for the initialization of model variables
IOCTL controlling the I/O environment
GRIBIN controlling the grib input
GRIBOUT controlling the grib output
SATCTL controlling computation of synthetic satellite images
EPSCTL controlling ensemble runs

All NAMELIST groups have to appear in corresponding input files INPUT_** in the order given
above. Default values are set for all parameters; you only have to specify values that differ from
the default. The NAMELIST variables can be specified by the user in the run scripts
for the COSMO-Model, which then create the INPUT_** files. See Appendices A.3 (NWP) and / or
B.3 (RCM) for details on the run scripts.
We will only describe some basic variables here. For a detailed explanation see the User Guide for
the COSMO-Model (Part VII: User Guide, Sect. 7).


3.1.1 The Model Domain

The model domain is specified by the variables in the Namelist group LMGRID. This group is similar
to the corresponding group in the INT2LM, but has fewer variables. For example, in the COSMO-
Model you cannot specify vertical coordinate parameters. These are given to the COSMO-Model
by the initial data (via Grib or NetCDF headers or hhl-files).

These are the variables in LMGRID with some example settings:

&LMGRID
Variable COSMO_EU COSMO_DE for RCM-Mode
ie_tot 665 421 101
je_tot 657 461 111
ke_tot 40 50 32
startlat_tot -17.0 -5.0 -24.09
startlon_tot -12.5 -5.0 -25.13
pollat 40.0 40.0 39.25
pollon -170.0 -170.0 -162.00
dlon 0.0625 0.0625 0.44
dlat 0.0625 0.0625 0.44
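
The horizontal extent of the domain follows directly from these settings. As a quick check (using the RCM-Mode column as an example), the domain spans in rotated coordinates:

(ie_tot - 1) * dlon = 100 * 0.44 = 44.0 degrees in longitude
(je_tot - 1) * dlat = 110 * 0.44 = 48.4 degrees in latitude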

3.1.2 Basic Control Variables

In this group, the basic control variables are set. Among them are the variables for specifying the
date and the range of the simulation, and the number of processors for the domain decomposition.
These variables have the same name and meaning as the corresponding variables in INT2LM, group
CONTRL and are not explained again here.

Other variables specify the time step, the number of boundary lines for the domain
decomposition, and which components should be used during the run of the COSMO-Model.

These are some example settings:

&RUNCTL
Variable COSMO_EU COSMO_DE for RCM-Mode
dt 66.0 25.0 240.0 time step in seconds
nboundlines 3 3 3 number of boundlines
lphys .TRUE. .TRUE. .TRUE. to run physical parameterizations
luseobs .TRUE. .TRUE. .FALSE. to run with observation processing
ldiagnos .TRUE. .TRUE. .TRUE. to produce diagnostics
luse_rttov .TRUE. .TRUE. .FALSE. to compute syn. satellite images
lartif_data .FALSE. .FALSE. .FALSE. to run idealized test cases


3.1.3 Settings for the Dynamics

In the group DYNCTL, either the Leapfrog dynamical core (a 3-time-level scheme) or the Runge-Kutta
dynamical core (a 2-time-level scheme) must be chosen. This is done by setting the variable l2tls
either to .FALSE. for Leapfrog or to .TRUE. for Runge-Kutta. For both schemes, additional parameters
can be specified.
In Version 4.24 a new fast-waves solver has been introduced for the Runge-Kutta scheme, which
has the following features:

- Improvement of the accuracy of vertical derivatives and averages.


- Use of the divergence operator in strong conservation form.
- Isotropic treatment of the artificial divergence damping.

These characteristics now allow for more stable integrations also in mountainous terrain with steeper
slopes. The fast-waves solver is chosen with the Namelist switch itype_fast_waves = 1
(old solver) or = 2 (new solver).
Usage of the Leapfrog dynamical core is no longer recommended. Most COSMO applications have
been ported to the Runge-Kutta dynamical core by now. There are some differences between the
COSMO_EU (with about 7 km resolution) and the COSMO_DE (with about 2.8 km resolution) settings.
The following tables give some values for the different applications.
&DYNCTL
Variable COSMO_EU COSMO_DE for RCM-Mode
l2tls .TRUE. .TRUE. .TRUE. to choose 2- or 3-tl scheme
lcond .TRUE. .TRUE. .TRUE. cloud water condensation
rlwidth 85000.0 50000.0 500000.0 width of relaxation layer (*)
y_scalar_advect ’BOTT2_STRANG’ Bott advection
irunge_kutta 1 1 1 to select Runge-Kutta scheme
irk_order 3 3 3 order of Runge-Kutta scheme
iadv_order 3 5 5 order of advection scheme
itype_fast_waves 2 2 1 type of fast waves solver
ldyn_bbc .FALSE. .FALSE. .FALSE. dynamical bottom boundary
lspubc .TRUE. .TRUE. .TRUE. sponge layer with damping
nrdtau 5 5 10 damping at model top
(*) The width of the relaxation layer should extend over about 15 grid points. Thus it should be about
15 times the resolution in meters (e.g., roughly 15 x 2800 m = 42000 m at 2.8 km resolution, consistent
with the 50000.0 used for COSMO_DE).
For the horizontal diffusion, special diffusion factors can be specified for the boundary and for the
interior zone of the domain. They can be specified for the wind speeds, pressure, temperature and
the tracer constituents.
itype_hdiff 2 2 2 type of horizontal diffusion
hd_corr_u_in 0.25 0.1 - diffusion factor (interior domain) for wind
hd_corr_u_bd 0.25 0.75 - diffusion factor (boundary zone) for wind
hd_corr_t_in 0.0 0.0 - diffusion factor (interior domain) for temperature
hd_corr_t_bd 0.0 0.75 - diffusion factor (boundary zone) for temperature
hd_corr_p_in 0.0 0.0 - diffusion factor (interior domain) for pressure
hd_corr_p_bd 0.0 0.75 - diffusion factor (boundary zone) for pressure
hd_corr_trcr_in 0.0 0.0 - diffusion factor (interior domain) for tracers
hd_corr_trcr_bd 0.0 0.0 - diffusion factor (boundary zone) for tracers


3.1.4 Settings for the Physics

In the group PHYCTL, the different parameterizations can be switched on or off. Most parameterization
packages have additional parameters for special settings. In &PHYCTL, the most important difference
between COSMO_EU and COSMO_DE is in the settings of the convection scheme:
while COSMO_EU uses the Tiedtke convection (itype_conv=0), COSMO_DE uses a shallow
convection scheme (itype_conv=3).

&PHYCTL
Variable COSMO_EU COSMO_DE for RCM-Mode
lgsp .TRUE. .TRUE. .TRUE. to run with microphysics
itype_gscp 3 4 3 to select special scheme
lrad .TRUE. .TRUE. .TRUE. to run with radiation
ltur .TRUE. .TRUE. .TRUE. to run with turbulence
lconv .TRUE. .TRUE. .TRUE. to run with convection
itype_conv 0 3 0 to select special scheme
nincconv 4 10 4 increment, when to call convection
lsoil .TRUE. .TRUE. .TRUE. to run with soil model
ke_soil 7 7 9 to select special scheme
lsso .TRUE. .FALSE. .TRUE. to run with SSO scheme

3.1.5 Settings for the I/O

There are 3 Namelist groups to specify various settings for input and output, such as choosing GRIB
or NetCDF as the data format. In IOCTL, the basic settings, which are valid for all I/O, are specified.

Note also the special Namelist variable lbdclim, with which the climate mode of the COSMO-Model
is activated.

&IOCTL
Variable COSMO_EU COSMO_DE for RCM-Mode
lbdclim .FALSE. .FALSE. .TRUE. to run the climate mode
yform_read ’grb1’ ’grb1’ ’ncdf’ to choose the data format
yform_write ’grb1’ ’grb1’ ’ncdf’ to choose the data format
ngribout 1 3 3/7 set number of different outputs

Note, that also ’apix’ (for reading GRIB data with ECMWF grib_api) and ’api1’, ’api2’ (for
writing GRIB data with grib_api) can be specified to choose the data format.

The settings of the following groups depend on the special job and there is no need to give different
specifications here.

&GRIBIN
ydirini=’/gtmp/uschaett/gme2lm/’, directory of the initial fields
ydirbd=’/gtmp/uschaett/gme2lm/’, directory of the boundary fields
hincbound=1.0, increment for lateral boundaries
lchkini=.TRUE., lchkbd =.TRUE., additional control output
lana_qi=.TRUE., llb_qi=.TRUE., cloud ice is provided for analyses and lateral
boundaries


&GRIBOUT
hcomb=0.0,48.0,1.0, start, stop and incr. of output
lcheck=.TRUE.,
yvarml=’default’,
yvarpl=’default’,
yvarzl=’default’,
lwrite_const=.TRUE.,
ydir=’/gtmp/uschaett/lm_test/with_out/’, directory of the result data

3.1.6 Settings for the Diagnostics

The COSMO-Model can produce some diagnostic output for a quick monitoring of the forecast.
Which output is generated can be specified in the group DIACTL.

&DIACTL
n0meanval=0, nincmeanval=1,
lgplong=.TRUE., lgpshort=.FALSE., lgpspec=.FALSE., n0gp=0, hincgp=1.0,
igp_tot = 3, 13, 25, 46,
jgp_tot = 31, 29, 96, 12,
stationlist_tot= 0, 0, 46.700, 6.200, ’RODGAU’,
0, 0, 50.050, 8.600, ’Frankfurt-Flughafen’,
53, 68, 0.0 , 0.0 , ’My-Own-Station’,

3.2 Exercises

3.2.1 NWP-Mode

EXERCISE:
Run the COSMO-Model using the initial and boundary conditions you produced in the
last session.

• Choose an appropriate runscript for the COSMO-Model and adapt it to your needs.
For coarser resolutions (up to 7 km), the run-scripts run_cosmo_eu or run_cosmo_7
can be used. For higher resolutions (to 2-3 km), the run-scripts run_cosmo_de or
run_cosmo_2 can be used.

• Use the initial and boundary data, which you produced in the last exercise

• Run the COSMO-Model


3.2.2 RCM-Mode

EXERCISE:
Part I
Run the COSMO-CLM using the initial and boundary conditions you produced in the
last session.

• Still in the step-by-step mode, adapt, if applicable, the corresponding run script
for COSMO-CLM (run_cclm) to your needs (e.g. specifying the working directory
SPDIR), using the initial and boundary data which you produced in the last exercise.

• Submit the job, run SAMOA for a basic check of your results and analyse the results
further.

• When successful, do a second simulation (both steps INT2LM and CCLM) for
a smaller domain nested inside the European domain with a higher resolution of
0.0625◦ . For this, use the scripts run_int2lm and run_cclm in the directory
$SPDIR/step_by_step/cclm_to_cclm.

Part II
Run the INT2LM and COSMO-CLM in the chain environment (see Appendix B.3).

• Go to the directory $SPDIR/chain/gcm_to_cclm/sp001. Change the paths in


subchain and templates/int2lm.job.tmpl.

• Start the chain by typing ./subchain start.

• If the cclm simulation is successful you can find the output data in the directories
$SPDIR/chain/arch/sp001
and as post-processed time series in $SPDIR/chain/work/sp001/post.

• Similarly, you can do also the nesting simulation in the directory


$SPDIR/chain/cclm_to_cclm/sp002.


Chapter 4

Visualizing COSMO-Model Output using GrADS

4.1 Introduction

The Grid Analysis and Display System (GrADS) is an interactive desktop tool that can be used
to visualize data from meteorological models. GrADS implements a 4-dimensional data model,
where the dimensions are latitude, longitude, level and time. Each data set is located within this
4-dimensional space by means of a data descriptor file. The user describes the dimension environment
as a desired subset of the 4-dimensional space; data is accessed, manipulated and displayed
within this subset.

After specifying the data descriptor file, the gribmap program has to be run to build up a
correspondence between the data set (usually a grib file) and the descriptor file. gribmap scans the
given data set for the desired data and produces an additional file, called an index file.

Once the data set has been scanned, data may be displayed using a variety of graphical output
techniques, including line, bar and scatter plots, as well as contour, shaded contour, streamline and
wind vector plots.

More information can be found on http://grads.iges.org/grads/head.html. This is the GrADS
home page, from where you can download all files, programs and documentation necessary to install
GrADS on your local platform.

4.2 The data descriptor file

The data descriptor file has the suffix .ctl and specifies the subset of the 4-dimensional space. The
user has to specify ranges for all four dimensions:

Dimension Command Example


Time tdef tdef 13 linear 12Z13may2013 1hr
Longitudes xdef xdef 129 linear 1 1
Latitudes ydef ydef 161 linear 1 1
Vertical levels zdef zdef 40 levels


The full syntax for the zdef command is

zdef number mapping <start increment> <value-list>.

with

number the number of grid values in the z-direction, specified as an integer number.

mapping the mapping type, specified as a keyword. Valid keywords are

linear linear mapping.


levels arbitrary levels (e.g. pressure or σ-levels).

start when mapping is linear, this is the starting value.

increment when mapping is linear, this is the increment in the z direction.

value-list when mapping is levels, the specific levels are simply listed in ascending order.
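
For illustration, a linear mapping could be written as follows (the numbers are placeholders):

zdef 40 linear 1 1

For the levels mapping, see the zdef entry in the prepared ctl-files mentioned below or in Figure 4.1.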

The specification of the value-list for the COSMO-Model σ-levels is not straightforward. You will
find data descriptor files for 40 and 50 vertical levels in

/e/uhome/fegast3/TRAINING_2016/grads

(called test_m40.ctl and test_m50.ctl, respectively). There are also example data descriptor files for
data on p- and on z-levels (called test_p.ctl and test_z.ctl, respectively).

Other necessary specifications for the data descriptor file are:

• The directory and the name of the data set, e.g.


dset /gtmp/uschaett/test_vampir/laf2002012300

• The name of the index file, e.g.


index laf.idx

• The number of the grid that is used. The COSMO-Model grid in rotated latitude-longitude
coordinates is not defined in the official tables of GrADS, so this specification has to be
dtype grib 255 (which means a user-defined grid).

• A value that should be placed instead of undefined values in a record (if any).
undef 9.999E+20

The third section is the specification of the variables that should be displayed. This section begins
with VARS nn, where nn is the number of variables that are specified and ends with ENDVARS. The
variables are determined by means of the grib code headers, e.g.:

Name NLEV EE LEVTYP LEVBOT Description


HSURF 0 8 1 0 ** Geometric height [m]
T 40 11 110 0 ** Temperature [K]
U10M 0 33 105 10 ** u wind [m/s] in 10 m

with


NLEV Number of vertical levels (0 stands for a two-dimensional field).

EE Element number in the Grib1 tables.

LEVTYP Leveltyp in the Grib1 tables.

LEVBOT Bottom level for Leveltyp 105.

A weakness of GrADS is that it cannot distinguish between different Grib tables. Therefore, variables
from different Grib tables but with the same element number cannot be displayed properly.
This holds, for example, for the wind speed U (in Grib Table 2, ee=33) and cloud ice QI (in Grib
Table 201, ee=33).

Figure 4.1 shows an example for a data descriptor file.

4.3 The gribmap program

gribmap establishes the connection between the data set specified in the data descriptor file and
the variables that are contained in this data set and listed in the descriptor file. The usage of
gribmap is:

gribmap -v -i test_m40.ctl

where:

-v Option for verbose output.

-i Option for specifying a data descriptor file for input.

If the -v option is specified, gribmap gives detailed information about every record processed from
the Grib file. This version is helpful to detect possible errors in the data descriptor file.

4.4 A Sample GrADS Session

Before you start

The following is based on a tutorial taken from the GrADS web page. The Grib output from the
reference data set (7 km version) is needed to go through this sample session. To avoid the problem
with same element numbers in different Grib tables and to have a higher output frequency, rerun
the run_cosmo_7-job with the following changes in the Namelist group GRIBOUT:

• eliminate QI from the output list in yvarml.

• For a higher output frequency set hcomb=0.0,78.0,1.0.

Then you need the following files:


dset /gtmp/for2sch/test_program/laf2002012300
index laf_test.idx
dtype grib 255
undef 9.999E+20
*
tdef 1 linear 00Z23jan2002 1hr
xdef 325 linear 1 1
ydef 325 linear 1 1
zdef 35 levels
8996 8739 8482 8225 7968 7711 7454 7197 6940 6683 6426 6169 5912 5655 5398 5141
4884 4627 4370 4113 3856 3599 3342 3085 2828 2571 2314 2057 1800 1543 1286 1029
772 515 258
*
VARS 32
HSURF 0 8, 1,0 ** Geometric height [m]
SOILTYP 0 57, 1,0 ** type of soil [idx]
ROOTDP 0 62, 1,0 ** depth of roots [m]
FR_LAND 0 81, 1,0 ** Land−sea mask (land=1;sea=0) [fraction]
PLCOV 0 87, 1,0 ** Vegetation [%]
PS 0 1, 1,0 ** Surface pressure [Pa]
PMSL 0 2,102,0 ** Pressure reduced to MSL [Pa]
QV_S 0 51, 1,0 ** Specific humidity [kg/kg]
TOT_PREC 0 61, 1,0 ** Total precipitation [kg/m^2]
RAIN_GSP 0 102, 1,0 ** grid scale rain [kg/m^2]
RAIN_CON 0 113, 1,0 ** convective rain [kg/m^2]
SNOW_GSP 0 79, 1,0 ** grid scale snow [kg/m^2]
SNOW_CON 0 78, 1,0 ** Convective snow [kg/m^2]
VMAX_10M 0 187,105,10 ** max. 10m wind gust [m/s]
U10m 0 33,105,10 ** u wind [m/s]
V10m 0 34,105,10 ** v wind [m/s]
TMAX2m 0 15,105,2 ** Max. temp. [K]
TMIN2m 0 16,105,2 ** Min. temp. [K]
TD2m 0 17,105,2 ** Dew point temp. [K]
T2m 0 11,105,2 ** Temperature [K]
U 35 33,110,0 ** u wind [m/s]
V 35 34,110,0 ** v wind [m/s]
T 35 11,110,0 ** Temperature [K]
P 35 1,110,0 ** Pressure deviation
QC 35 31,110,0 ** Specific cloud water content [kg/kg]
QV 35 51,110,0 ** Specific humidity [kg/kg]
T_S 0 85,111,0 ** Soil temp. [K]
T_M 0 85,111,9 ** Soil temp. [K]
T_CL 0 85,111,41 ** Soil temp. [K]
W_G1 0 86,112,10 ** Soil moisture content [kg/m^2]
W_G2 0 86,112,2660 ** Soil moisture content [kg/m^2]
W_CL 0 86,112,25790 ** Soil moisture content [kg/m^2]
ENDVARS

Figure 4.1: Example for a GrADS data descriptor file


all_lfff_refdata This grib file contains all output files from the COSMO-Model reference data set.

lfff00000000c, lfff00000000, . . . , lfff00120000

If you made the reference run, you can create it by typing

cat lfff000000c lfff*00 > all_lfff_refdata

ref_model.ctl The GrADS data descriptor file for the reference data set.

The grib file all_lfff_refdata and the data descriptor file ref_model.ctl can be accessed from
/e/uhome/fegast3/TRAINING_2016/data.
Note that there is a tool grib2ctl, briefly described in Sect. 4.5.2, which can automatically
generate .ctl-files from a given grib file.
You may want to look at the descriptor file before continuing. Especially the description of the
subset of the 4-dimensional space needs to be explained in detail:

tdef 13 linear 12Z13may2013 1hr This data set contains COSMO-Model output data for
13 hours (12-24 UTC) from 13. May 2013 in hourly
intervals.
xdef 129 linear 1 1 129 grid points in longitude direction.
ydef 161 linear 1 1 161 grid points in latitude direction.
zdef 40 levels 40 vertical levels.
10281 10024 9767 9510. . . specification list of the vertical coordinates.
Please copy the data descriptor file to a local directory before proceeding.

GrADS Environment Variables


GrADS works with scripts that can be executed during a GrADS session. You also have to
access special data, e.g. to display a map background. The directories where data and scripts are
stored have to be specified by special GrADS environment variables: GADDIR specifies the directory
with special GrADS data and GASCRP defines the directory holding GrADS scripts. The settings for
this course are:
export GADDIR=/e/uhome/fegast3/grads/data
export GASCRP=/e/uhome/fegast3/TRAINING_2016/grads/grads_scripts

Starting GrADS
GrADS can be started interactively by using the command grads. The user is prompted for specify-
ing a landscape or a portrait window. Normally, a landscape window is chosen, so just press ENTER.
GrADS then opens a graphic output window on your console, in which the data will be displayed.
This window can be moved or resized. You have to enter GrADS commands from the window where
you started GrADS – this window will need to be made the active window and you must not entirely
cover that window with the graphics output window.
In the text window (from where you started GrADS), you should now see the GrADS prompt ga->.
You have to enter GrADS commands at this prompt and you will see the results displayed in the
graphics output window.
First, the data descriptor file has to be opened. For that you have to enter:


ga-> open ref_model.ctl

You can take a look on the contents of this file by entering

ga-> query file

Displaying Data

Data can be displayed by using the command display (or abbreviated: d):

ga-> d var_name

where var_name is the name of the variable as it was defined in the data descriptor file, e.g.:

ga-> d u

ga-> d hsurf

Note that GrADS displays an x-y-plot at the first time and at the lowest level in the data set. Per
default, contour lines are plotted. This can be changed by entering

ga-> set gxout style

where style is one of the following:

contour the default

shaded coloured plots

grfill the single grid cells can be seen in the coloured plot

grid the values are written in the grid cell (not useful for bigger domains)

vector for displaying special variables (see below)

stream for displaying special variables (see below)

Display some variables in different styles, e.g.

ga-> d u (the default are contour lines)

ga-> c (for clear: clears the display)

ga-> set gxout shaded

ga-> d u

ga-> c


ga-> set gxout grfill

ga-> d u

For the styles shaded and grfill, you can draw a colour-bar to see the range of the values of the
displayed variable by entering

ga-> run cbar

Note that cbar also is a user defined grads script.

Using the styles vector or stream, you can visualize e.g. the wind:

ga-> set gxout vector

ga-> d u;v

ga-> c

ga-> d skip(u,5,5);skip(v,5,5) (Note the difference!)

ga-> c

ga-> set gxout stream

ga-> d u;v

ga-> c

By specifying a 3rd field, the vectors can also be colourized, e.g.

ga-> set gxout vector

ga-> d u;v;t

ga-> c

You will not see very much with the style grid, unless you change the size of the displayed domain.

Altering the dimensions

You now will enter commands to alter the dimension environment. The display command (and,
implicitly, the access, operation, and output of the data) will do things with respect to the current
dimension environment. You control the dimension environment with the set command.

ga-> c

ga-> set x 35

ga-> set y 30

ga-> d t


In the above sequence, the x- and y-dimensions are set to a single value. Because the z- and t-
dimensions are also only a single value, all dimensions are now fixed and the result is just a single
value, in this case the value for grid point (35,30), at the lowest level and the 1st time in the data
set.

If you now enter

ga-> c

ga-> set x 1 129

ga-> d t

Now x (the longitude) is a varying dimension and displaying a variable will give a line graph.

Now enter

ga-> c

ga-> set y 1 161

ga-> d t

and you will get a two-dimensional plot again. You can also choose a subdomain for plotting:

ga-> c

ga-> set x 31 43

ga-> set y 25 45

ga-> d t

And on such a subdomain you can take a look on the values of the grid cells:

ga-> c

ga-> set gxout grid

ga-> d t

If also a third dimension is varied, you will get an animation sequence, which can be done e.g.
through time:

ga-> c

ga-> set gxout shaded

ga-> set x 1 129

ga-> set y 1 161

ga-> set t 1 13


ga-> d t

The command to change the vertical level is the same

ga-> set z num

where num is a number from 1 (surface) up to the number of vertical levels (here: 40, the top of
the atmosphere).

By setting the y- (latitude) and z- (level) dimension to vary, you can plot vertical cross sections
(Exercise!)

Operations on data

GrADS offers the possibility to perform operations or several built-in functions on the data. To see
the temperature in Fahrenheit instead of Kelvin, you can do the following conversion:

ga-> d (t-273.15)*9/5+32

(How can you display the temperature in Celsius?). If you use the display style shaded, you should
also plot the colour bar to see the differences.

Any expression may be entered that involves the standard operators of +, -, * and /, and which
involves operands which may be constants, variables, or functions. Here is an example involving
functions to calculate the magnitude of the wind. The sequence

ga-> c

ga-> d sqrt(u*u+v*v)

can be replaced by the function

ga-> d mag(u,v)

Another built in function is the averaging function:

ga-> d ave(ps,t=1,t=7)

In this case the mean surface pressure over the first 7 output times is calculated.

You can also perform time differencing

ga-> c

ga-> d ps(t=2) - ps(t=1)


to see the change of a variable between two output times.

In this example you see a more common form to refer to a variable by explicitly setting the time in
brackets. You can also do such a calculation by using an offset from the current time:

ga-> c

ga-> d ps(t+1) - ps

The complete specification of a variable name is:

ga-> name.file(dim +|-|= value, ...)

If there are two files open, perhaps one with model output, the other with analyses, you can also
take the difference between two fields by entering

ga-> d ps.2 - ps.1

Printing data

Newer versions of GrADS have a convenient way of printing data to a gif-file:

ga-> printim file_name

where file_name is the name that is given to the file. You can take a look at the contents of
file_name by using display.

Ending GrADS

The command to end the GrADS session is

ga-> quit

4.4.1 Drawing a Map Background

Changing the ctl-file

Drawing a map background is an advanced feature when visualizing COSMO-Model output with
GrADS: because of the rotated latitude-longitude grid, the predefined GrADS maps cannot be used.
But we have some special maps for rotated grids that are used for the reference run. The maps are
provided in a low, medium and high resolution over Europe for the rotation with pollat=32.5:

gmt_l_rot_eu low resolution map

gmt_m_rot_eu medium resolution map

gmt_h_rot_eu high resolution map


and for the rotation with pollat=40.0:

gmt_l_rot_40.0 low resolution map

gmt_m_rot_40.0 medium resolution map

These maps can be accessed from the directory /e/uhome/fegast3/grads/data. For other rotations,
other maps have to be used. To use these maps, the rotated coordinates of the latitudes and the
longitudes have to be specified in the data descriptor file:

xdef 129 linear -4.0 0.0625

ydef 161 linear -5.0 0.0625

Note: the settings before counted the grid points; now the rotated coordinates are specified with the
start values (-4.0 and -5.0, respectively) and the increment (the resolution) 0.0625 (degrees).

Then gribmap has to be run again.

Control the map drawing

Start GrADS, open the data descriptor file and run the initialization script.

ga-> open ref_model.ctl

ga-> run pin

The script pin.gs (.gs stands for grads script) just specifies the plotting area and defines some
COSMO-Model specific features. It can be accessed from the GrADS scripts-directory, which is set
by an environment variable (see above). You can take a look to see how this script looks like.

To control the map drawing, you have to set

ga-> set mproj latlon (sets the map projection to a lat-lon grid where the aspect ratio is
maintained)

ga-> set mpdset < gmt_l_rot_40.0 | gmt_m_rot_40.0 >

If you now display a variable, the political borders and the coastlines are drawn by default in
addition to the variable.

ga-> d hsurf

To switch off the borders and / or the coastlines you can use the set mpt command. The syntax is

set mpt type < off | <color>,<style>,<thickness>>

where type, color, style and thickness can be chosen from the following table:


type *: asterisk for application to all types


1: coastlines
2: political borders
3: main rivers
4: additional rivers
5: canals
color values for GrADS default colours range from 1 to 16
(1: black; 2: red; 3: green; 4: blue, etc.)
style 1: solid
2: long dash
3: short dash
4: long dash, short dash
5: dotted
6: dot dash
7: dot dot dash
thickness values range from 1 to 6

To also switch on the main rivers in a black dotted line, you can enter

ga-> set mpt 3 1 5

and then display a variable. To switch all maps off, you can type

ga-> set mpt * off

Note that if you now want to set a dimension to a specific grid point, you cannot use the indices
1, . . . , 161 any more, but have to specify the values of the latitudes and longitudes, respectively.

4.4.2 Use of GrADS in Batch mode

You have already used the scripts pin.gs and cbar.gs above. You can easily generate your own
scripts by enclosing most commands in apostrophes (exceptions are variable definitions, ...).
A short example is shown here

# comments must start with a # in the FIRST COLUMN!

’open test_m40.ctl’

’run pin’
’set gxout grfill’

’d t’

’printim nice_plot.gif’


Let us name this script ’myscript.gs’. You can of course call this script during your grads session;
but you can also call it from the shell:

> grads -b -c ’myscript.gs’

To automatically quit grads you should finish the above script with a line ’quit’. Using scripts in
this way allows you to use GrADS as a graphics production tool.

4.5 Helpful Tools: wgrib and grib2ctl

As already mentioned above, GrADS has problems when two variables that have the same element
number but come from different Grib tables should be visualized. Another problem is the visualization
of three-dimensional variables that are located at different vertical positions. In the COSMO-Model,
most atmospheric variables are defined on main levels, while the vertical velocity W (and some related
variables) is defined on half levels, i.e. the interfaces between the main levels. This is the reason
why 41 different two-dimensional levels are defined for W in the ctl-file test_m40.ctl. As a consequence,
it is not possible to draw cross sections for W.

There are tools which now offer convenient solutions for these problems. With wgrib it is possible to
extract certain variables out of a Grib file into a new Grib file; and if only one type of atmospheric
variable is present, GrADS has no problem. And with the tool grib2ctl, a ctl-file can be
created automatically, so there is no need to keep several of these files.

4.5.1 wgrib

A list of contents of a Grib file:

To see a list of all variables in a Grib file <name> type:

wgrib <name>.

The listing gives you (among others) information about

d=09122112: the date

U: the name used in the COSMO-Model

kpds5=33: the element number (here defined by kpds5)

kpds6=110: the leveltyp (110 are variables on main levels, 109 variables on half levels)

Extracting variables out of a Grib file:

The output of the listing above can be piped into wgrib again to extract certain variables, which
you have to specify by the element number (or kpds5):

wgrib <name> | egrep -i <string> | wgrib -grib -i <name> -o <name>_suffix,

where <string> is e.g.


":U:kpds5=33:kpds6=110:" to extract U

":V:kpds5=34:kpds6=110:" to extract V

":W:kpds5=40:kpds6=109:" to extract W

":T:kpds5=11:kpds6=110:" to extract T

":P:kpds5=1:kpds6=110:" to extract P

":PP:kpds5=139:kpds6=110:" to extract PP

":QI:kpds5=33:kpds6=110:" to extract QI

You can also add several entries with the | in between:


":U:kpds5=33:kpds6=110:|:V:kpds5=34:kpds6=110:"

4.5.2 grib2ctl

Since early versions of GrADS, a tool grib2ctl has been offered that automatically creates ctl-files
for a given grib file:

grib2ctl <gribfile>.

But early versions could not handle the rotated lat-lon grid of the COSMO-Model and the different
grib tables that are used in the COSMO-Model. For some time now, a version has been available that
works better on GRIB data from the COSMO-Model. At DWD, a perl script called gribapi2ctl has
been written, based on grib2ctl and the new GRIB library grib_api. Run on a GRIB file, it creates
the .ctl file and already runs gribmap, so that you can start grads directly. gribapi2ctl also works
for GRIB2 data. Its only weakness is that the different vertical level types in GRIB1 still cannot be
handled correctly. The vertical velocity W should therefore be handled separately.

The perl-script can be found in /e/uhome/fegast3/TRAINING_2016/grads/bin

4.5.3 Exercise

EXERCISE:
Extract the vertical velocity W out of one grib file and try to plot a x-z cross section.


Chapter 5

Postprocessing and visualization of NetCDF files

5.1 Introduction

Many visualization programs such as GrADS, R or Matlab now include packages with which NetCDF
data files can be handled. In this course we will use a very simple program, ncview, which does not
have a large functionality but is very easy to use for a quick view of NetCDF output files
and therefore very useful for a first impression.

For a quick overview of dimensions and variables, the tool ncdump can be used, which is part of
the basic NetCDF package. This will be described briefly first. More sophisticated tools exist, e.g. for
cutting out subsets of data and producing averages or time series. Two of these tools which are
currently frequently used in regional climate modelling are the CDO and NCO utilities.

In the CLM-community, a tool exists for aggregating temperature and precipitation fields to monthly
values, cutting out the PRUDENCE regions and comparing them to E-OBS observation data. This
tool is called ETOOL and is available on the CLM-community web page.

Note that for this year's exercises, the NetCDF package of version 4 or higher is needed, and some
NetCDF files (e.g. the external parameter file or tar-archived CCLM output) are internally compressed
using nccopy. Also, the NetCDF tools ncview, CDO and NCO have to be compiled and linked with the
netcdf4 package. This may cause problems if you want to handle the data files on other computer
systems.

5.2 ncdump

Ncdump comes with the NetCDF library as provided by Unidata and generates a text representation
of a NetCDF-file on standard output. The text representation is in a form called CDL (network
Common Data form Language). Ncdump may be used as a simple browser for NetCDF data files,
to display the dimension names and sizes; variable names, types and shapes; attribute names and
values; and optionally, the data values themselves for all or selected variables in ASCII format. For
example, to look at the structure of a NetCDF file use
ncdump -c data-file.nc
Dimension names and sizes, variable names, dependencies and values of dimensions will be displayed.
To get only header information (same as -c option but without the values of dimensions) use


ncdump -h data-file.nc
To display the values of a specified variable which is contained in the NetCDF file type
ncdump -v <Variable> data-file.nc
To send data to a text file use
ncdump -b c data-file.nc > data-file.cdl
to produce an annotated CDL version of the structure and the data in the NetCDF-file data-file.nc.
You can also save data for specified variables for example in *.txt-files just using:
ncdump -v <Variable> data-file.nc > data-file.txt
For further information on working with ncdump see
http://www.unidata.ucar.edu/software/netcdf/docs/ncdump-man-1.html

5.3 CDO and NCO

CDO (Climate Data Operators) is a collection of command line operators to manipulate and analyse
NetCDF data, which have been developed at MPI for Meteorology in Hamburg. Source code and
documentation are available from:
https://code.zmaw.de/projects/cdo
The tool includes more than 400 operators to: print information about datasets, copy, split and
merge datasets, select parts of a dataset, compare datasets, modify datasets, arithmetically process
datasets, to produce different kind of statistics, to detrend time series, for interpolation and spectral
transformations. The CDOs can also be used to convert from GRIB to NetCDF or vice versa,
although some care has to be taken there.
NCO is short for "NetCDF Operator". It has been developed in the US and is now an Open
Source Project. As CDO, the NCOs are a collection of command line operators. Source code and
documentation are available from:
http://nco.sourceforge.net
The main operators are: arithmetic processor, attribute editor, binary operator, ensemble averager,
ensemble concatenator, file interpolator, kitchen sink, permute dimensions quickly, pack data quietly,
record averager, record concatenator, renamer, weighted averager.
Depending on the specific task of your postprocessing steps, it is useful to have a closer look at
whether a CDO or NCO operator would be more suitable.
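
As a small illustration of how these operators are typically called (file and variable names are placeholders; see the cdo.pdf and nco.pdf documentation for the exact syntax and behaviour of each operator):

cdo sinfo out.nc                            print a summary of the file contents
cdo monmean in.nc out_monmean.nc            monthly means of all fields
cdo sellonlatbox,5,15,45,55 in.nc out.nc    cut out a longitude-latitude subdomain
ncks -v T_2M in.nc t2m.nc                   NCO: extract a single variable
ncrcat out1.nc out2.nc merged.nc            NCO: concatenate files along the record dimension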

5.4 ncview

Ncview is a visual browser for NetCDF format files developed by David W. Pierce. Using ncview
you can get a quick and easy look at your NetCDF files. It is possible to view simple movies of
data, view along different dimensions, to have a look at actual data values at specific coordinates,
change colour maps, invert data, etc.
To install ncview on your local platform, see the ncview-Homepage:
http://meteora.ucsd.edu/~pierce/ncview_home_page.html


You can run the program by typing:

ncview data-file.nc

which will open a new window with the display options.

If data-file.nc contains wildcards such as ’*’ or ’?’, then all files matching the description are scanned,
provided all of the files contain the same variables on the same grid. Choose the variable you want to view.
Variables which are functions of longitude and latitude will be displayed in 2-dimensional images.
If there is more than one time step available you can easily view a simple movie by just pushing
the forward button. The view of the image can be changed by varying the colours of the displayed
range of the data set values or by adding/removing coastlines. Each 1- or 2-dimensional subset of
the data can be selected for visualization. Ncview allows the selection of the dimensions of the fields
available, e.g. longitude and height instead of longitude and latitude of 3D fields.

The pictures can be sent to *.ps output by using the print function. Be careful to use the close button
whenever you want to close only a single plot window, because clicking the x-icon at the top right of
the window will close all ncview windows and terminate the entire program.

5.5 Further tools for visualization and manipulation of NetCDF data

There are many more programs for viewing and manipulating NetCDF data. A list of freely available
software as well as commercial or licensed packages can be found at
http://www.unidata.ucar.edu/software/netcdf/software.html.

Furthermore, there are now interfaces and routines to handle NetCDF data for almost every pro-
gramming and script language available.

5.6 ETOOL

The ETOOL is a shell script which is designed to perform some basic analysis of COSMO-CLM
output data and to compare it to the E-OBS gridded observational data. Monthly statistics for tem-
perature and precipitation are calculated, and the European ’PRUDENCE’ regions are extracted.
It is also possible to define custom regions. The program works for all horizontal resolutions, as
the validation is calculated only for area means of the selected sub-regions. Note that only land
points (FR_LAND >= 0.5) are included in the analysis, since the model output is remapped onto
the E-OBS grid. To run the program you need the E-OBS data set and the CDO package, as the
calculations are performed using the CDO commands.

The script has been provided by Peter Berg from KIT and is available from the CLM-community
web page (http://www.clm-community.eu/index.php?menuid=85&reporeid=79).

5.7 Exercises

The post-processing exercises will help you to handle the model output and to do some first evalua-
tions of your climate simulations compared to observations. The NCO and CDO tools will be used
for that, and we will have a look at the ETOOL. We use the output of the 2-month sp001 chain
experiment and the E-OBS data set.

The PDF output files can be opened with the evince command.

EXERCISES:

• Have a look at your data (model output and observations) using ncview:
ncview <file>.nc

• Practise the use of the NCO tools. You can use the documentation nco.pdf if you
need detailed information about a command.

• Practise the use of the CDO tools. You can use the documentation cdo.pdf if you
need detailed information about a command.

• Have a look at the ETOOL script and practise its usage.


Chapter 6

Running Idealized Test Cases

6.1 General remarks

A so-called idealized case is a model run without coupling/nesting to any sort of coarser “driving”
model. There is no need for preprocessing with INT2LM. Instead, all necessary initial and boundary
data are computed within the COSMO-Model itself. To accomplish this, some simplifications can
be introduced, e.g., horizontally homogeneous profiles of p, T and ~v , time-invariant lateral boundary
data, running without soil model and radiation, idealized terrain-shapes etc., but more complicated
things are imaginable as well.

Many analytic results from theoretical meteorology are based on the solution of the atmospheric flow
equations under simplified conditions (e.g., wave flow over hills), and it is often possible to reproduce
these conditions numerically in the idealized framework. Therefore, the ability to run idealized model
setups serves, on the one hand, as a simple possibility to test the correctness of particular aspects of
the COSMO model (e.g., the dynamical core only, some physical parameterizations) by comparison
to such analytical reference solutions from theory, and, on the other hand, it enables the researcher
to investigate atmospheric processes in their pure form.

To run the model in idealized mode, the following “ingredients” have to be defined/computed within
the model, which would normally, at least partly, come from other sources (INT2LM, EXTPAR):

- height distribution of vertical model levels,

- orography,

- external parameters, depending on the configuration (e.g., run w/o soil model, radiation),

- initial conditions for p, T and ~v (can be simplified, so that every prognostic variable satisfies
φ = φ(z)),

- boundary conditions (e.g., same as initial condition and held constant in time, or periodic, or
“open”),

- For studies of convection: artificial convection triggers, random noise.

The code for these ingredients is implemented in the source code file src_artifdata.f90. This file
contains quite a large number of subroutines, offering different possibilities for different things, but
it is certainly not exhaustive and may be extended by users on their own.


However, a lot of idealized cases can already be configured by the namelist parameters
of the namelist group ARTIFCTL in the file INPUT_IDEAL, and for most beginning users
there will be no need to change the source code.

This namelist-driven one-source-code approach not only facilitates parameter changes but, more
importantly, allows the user to freely combine the various ingredients to create “new” cases. This flexibility
offers, e.g., the possibility for model developers to quickly set up an idealized case tailored to test
their current development. And it opens up new ways for meteorological researchers to conduct
idealized studies on atmospheric processes.

To get the user started, we provide several (well commented) example runscripts for common ide-
alized cases (e.g., 2D flow over hill, 3D “warm bubble” triggered convective systems, quasi-LES
simulations for studies on shallow convection, . . . ), see below. These example scripts are available
in the subfolder RUNSCRIPTS of the COSMO-model package and their names start with the prefix
run_ideal. The idea is that the user copies one of the scripts that is “closest” to his/her intended
case setup to a separate directory, modifies it accordingly and runs it from there. The script named
just run_ideal is a script which contains all namelist parameters of ARTIFCTL as a reference. Doc-
umentation can be found in the document artif_docu.pdf, see below.

However, as has been said, the user is free to add own specialized ingredients to src_artifdata.f90,
if he is not satisfied with the possibilities already given. Most of such ingredients are likely to fit
into the present code structure of this file at some more or less well-defined entry points, and the
top-level routines of src_artifdata.f90 are:

- SUBROUTINE input_artifctl: Input of the various namelist parameters. You can add your own
parameters here by following the pattern of those already present.

- SUBROUTINE gen_ini_data: Is called once at the beginning of a run and generates vertical
coordinates, orography, initial atmospheric profiles (~v , T , moisture, p) and, if needed, soil- and
other external parameters. Here most of the namelist parameters take effect, and a number
of subroutines for certain tasks are called. This is the routine where your own code for the initial
state should be added.

- SUBROUTINE gen_bound_data: Is called in every timestep and sets the boundary fields.

These subroutines are extensively commented, so do not hesitate to dive into the code! You will see
that there are a lot of (well-commented) child-subroutines, which are called from the above top-level
ones. However, a detailed description is beyond the scope of this introduction.

Necessary material for the exercises in this chapter can be found in:

/e/uhome/fegast3/TRAINING_2016/ideal

- The runscripts run_ideal, run_ideal_hill, run_ideal_hill_2, run_ideal_wk82, which are
  copies from the RUNSCRIPTS folder.

- In the subdirectory Doc:

· GrADS reference card, wgrib readme,
· artif_docu.pdf – Documentation of running idealized cases and of the namelist ARTIFCTL,
  also available on the web from
  http://www.cosmo-model.org/content/model/documentation/core/artif_docu.pdf


· articles of Weisman, Klemp (1982) for the warm bubble test, Schär et al. (2002) for flow
over a mountain.

To use src_artifdata.f90 and run the model in idealized mode, you have to set the namelist
parameter lartif_data=.TRUE. in the namelist group RUNCTL. Then the namelist ARTIFCTL is
read and the idealized case will be set up and run according to the namelist specifications.

6.1.1 Visualization

The inspection of the results for idealized test cases needs vertical cross sections, too. GrADS is not
well suited for this purpose, mainly because it cannot use the height information of the terrain-following
coordinate. Our recommendations here are:

• either produce NetCDF-output files and use ncview (see section 5.4).
• Or, if you want to use GrADS (and GRIB-output), then as a workaround you can use the
following methods (see also section 4 for more information).

1. Use the output on z-levels:
   define several z-levels (up to 200) in the GRIBOUT namelist section.
2. Use the output on model levels:
   extract either only the T, P, U, V fields (on main levels) or only the W field (on half levels)
   out of a file <gribfile> (e.g. <gribfile> = lfff01000000) into a new file.

– Either use wgrib. Example:
  > wgrib <gribfile> | egrep -i <string> | wgrib -grib -i <gribfile> -o <gribfile>_uv
  and for <string> e.g.
  ":U:kpds5=33:kpds6=110:|:V:kpds5=34:kpds6=110:"
  if you want to extract U and V into a new file with the suffix _uv.
– Or use the GRIB_API. Example:
> grib_copy -w shortName=U,typeOfLevel=hybridLayer lfff00000000 U.grb
> grib_copy -w shortName=V,typeOfLevel=hybridLayer lfff00000000 V.grb
> cat U.grb V.grb > UV.grb
if you want to extract U and V.
Hint: you find the relevant names for the -w-Option by using grib_ls <gribfile> or
grib_dump <gribfile>.

To generate a .ctl file use e.g.:
> grib2ctl lfff01000000_uv > l24.ctl
and then generate an appropriate index-file by
> gribmap -0 -i l24.ctl
(try also gribmap -help).
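
The GRIB_API-based steps can also be glued together in one short shell sequence. This is only a
sketch; it assumes that grib_copy, grib2ctl and gribmap are available in your PATH and that
lfff01000000 contains U and V on hybrid layers as above:

> grib_copy -w shortName=U,typeOfLevel=hybridLayer lfff01000000 U.grb
> grib_copy -w shortName=V,typeOfLevel=hybridLayer lfff01000000 V.grb
> cat U.grb V.grb > lfff01000000_uv
> grib2ctl lfff01000000_uv > l24.ctl
> gribmap -0 -i l24.ctl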

6.1.2 gnuplot

The COSMO model generates not only GRIB or NetCDF output but also several ASCII files. In
particular, YUPRMASS and YUPRHUMI contain time series of some variables. It is sometimes useful to
get a graphical impression of those time series, and there are a lot of graphics tools to do that.
gnuplot is a command line plotting tool to, e.g., quickly produce 2D line plots from ASCII data files,
but it is also possible to produce more fancy plots like contour, surface or 3D graphs. It is freely
available for nearly all platforms.


Call it by typing
> gnuplot
Now you can e.g. type
> plot "YUPRMASS" using 1:4
This takes the 1st column of the file YUPRMASS as the x values and the 4th column as the y-values.
You have several possibilities to influence the plotting, e.g. use lines instead of points by
> set style data lines
or have a closer look at the first 50 time steps by
> set xrange [1:50]
Type > ? plot to get help and some examples about the plot-command.
Type > ? set to get help about a lot of possible settings for labels, ...
Finish gnuplot by
> quit
Note that gnuplot tries to ignore the comment and text lines in our YUPRMASS and YUPRHUMI files as
well as it can. But if you only specify one column, e.g. using 4 instead of using 1:4, to plot the
data from the 4th column (y values) as a function of an integer data point index (x values), it will
fail, because this data point index is reset to 0 after each ignored text line.
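
If you prefer a plot file over an interactive window, gnuplot can also be run in batch mode from the
shell, e.g. with a here-document. This is a minimal sketch; it assumes your gnuplot was built with
the png terminal (otherwise use e.g. set terminal postscript):

> gnuplot << EOF
set terminal png
set output "yuprmass.png"
set style data lines
plot "YUPRMASS" using 1:4 title "column 4"
EOF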

6.2 Exercises

run_ideal is a job-file containing a documented namelist ARTIFCTL, which contains all possible
namelist parameters for idealized cases in a cook-book-like fashion.
However, to get you started, there are job-files for certain cases with reduced namelists, which
contain only those parameters relevant for the respective case (e.g. run_ideal_hill).
Section A.3.2 in the appendix demonstrates the general framework of these runscripts to facilitate
their use and your own modifications.
Once you feel familiar with the concept, you can start from the full run_ideal and explore
the various possibilities to define the idealized elements (orography, initial conditions, boundary
conditions (lower, upper, lateral), convection triggers, . . . ). After that, a more advanced topic is
to introduce your own modifications into the source file src_artifdata.f90 according to your special
needs.

Mountain flow ’gaussian hill’

- If not already done, compile the model.

- Adapt the runscript run_ideal_hill:

· Choose computing platform.
· Name of the COSMO-model directory, under which the src/-directory is found.
· Name of the executable.
· Name of the output directory.
· Output data format (grib1 or netCDF) by setting yform_write=’ncdf’ or yform_write=’grb1’.


· Mountain height hill_height.
· Mountain width hill_width_x.
· Inflow velocity u_infty (e.g. also v = 0: atmosphere at rest).

and run tests by typing: > run_ideal_hill

- Have a look at YUSPECIF. Are the namelist parameter settings really what you wanted to
simulate?

- Have a look at the temporal behaviour of some means in YUPRMASS and YUPRHUMI (you
can visualize the temporal behaviour e.g. with gnuplot).

- In case of grib1-output: Have a look into one of the grib files by
  > wgrib lfff01000000 | less
  and extract the field U or W as described above.

- In case of grib1-output, plot vertical cross sections of U, W, T, P and θ with GrADS. (hill.gs
is a very simple script for a plot of U).

- In case of NetCDF output, use ncview for visualization.
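
For orientation, the corresponding part of the ARTIFCTL namelist in the runscript might look
similar to the following excerpt. This is only a sketch: the numerical values are purely illustrative
assumptions; check the comments in run_ideal_hill and artif_docu.pdf for the actual defaults
and units.

cat > INPUT_IDEAL << end_input_ideal
 &ARTIFCTL
  hill_height  = 800.0,     ! mountain height (illustrative value)
  hill_width_x = 10000.0,   ! mountain width in x-direction (illustrative value)
  u_infty      = 10.0,      ! constant inflow velocity (illustrative value)
  ...
 /
end_input_ideal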

Warm bubble, non-linear convection with phase change

Literature: Weisman, Klemp (1982) MWR

- Use run_ideal_wk82 and run tests with different inflow velocity profiles
(modify the namelist parameter u_infty).

- Plot vertical cross sections of W, T, P, θ and the humidity variables QV, QC, QI.

- Plot horizontal cross sections in z = 5000m of W with vector plots of (U,V) (not possible with
ncview and NetCDF output).

Advanced cases

Flow over own ’hill-valley-hill’-combination

- Use job-file run_ideal_hill_2, which is an extension of run_ideal_hill to allow your own
creative modifications. Play around with the number of mountains/valleys, the initial T - and
u-profiles, etc:

· E.g., switch on 3 mountains/hills by setting lhill=.true.,.true.,.true.,.
· Expand the other hill-related namelist variables to corresponding lists of 3 items, e.g.
  mountain height hill_height=1000.0,-300.0,500.0, (negative values denote a valley),
  mountain position in gridpoints hill_i=80.5,113.5,145.0,
· To change the initial wind profile u(z), set itype_anaprof_uv accordingly (see com-
ments in job-file). Possibilities are constant u itype_anaprof_uv=3, u_infty=... (e.g.
also ~v = 0: atmosphere at rest), or different layers with linear windspeed gradi-
ent itype_anaprof_uv=2 and related other parameters, or Weisman-Klemp-type tanh-
shaped profile itype_anaprof_uv=1, u_infty, href_wk .


· To change the initial profiles T (z) and Qv (z), set itype_anaprof_tqv and the related
namelist parameters according to your wishes (see comments in job-file), e.g., choose a
no-slip or free-slip lower boundary condition by setting lnosurffluxes_m=.false. or
.true.
· The resulting orography is added to the base height href_oro. By choosing href_oro
appropriately, make sure that the resulting orography is not lower than the base height
of the thermodynamic profiles!
· Think what you are doing, it is your responsibility that the chosen combination makes
any sense!

Convective cell moving over a mountain

- Copy run_ideal_hill to e.g. run_ideal_hill_bubble and modify it accordingly:

- Combine the flow over a 2D-hill (a flow-perpendicular dam over the whole model domain) with
the triggering of convection upstream of the mountain. Change from a 2D to a 3D domain
(hint: je_tot, l2dim). Explore hcond_on (time in h of switching on the condensation and
microphysics) to allow the mountain flow to relax against a steady state before triggering
convection by a bubble of type, e.g., "cos-hrd" (heating rate disturbance in a Weisman-
Klemp-shaped bubble during a specified heating period).
- Change from the stable stratification of the mountain wave flow to a potentially un-
stable stratification according to Weisman and Klemp. Hint: cf. run_ideal_wk82 about
itype_anaprof_tqv and itype_anaprof_uv and related parameters, which have to be added
to run_ideal_hill_bubble.
- Choose fixed relaxation boundary conditions in x-direction and periodic boundary conditions
in y-direction by lperi_x=.false., lperi_y=.true.
- Plot vertical cross sections of W, T, P, θ and the humidity variables QV, QC, QI.
- Plot horizontal cross section in z = 2000m of W with vector plots of (U,V) (not possible with
grib1-output).

Mountain flow ’Rippelberg’

Literature: Schär et al. (2002), Mon. Wea. Rev., section 5b

- Change the program code in src_artifdata.f90:
· Program an orography as given in Schär et al. (2002);
  hint: look for subroutine add_orography_ana, implement your own case ’schaer’ by
  orienting yourself on the case ’gauss-2d-simple’ (hidden in the subroutine) and make
  best use of existing namelist parameters like hill_height, hill_width_x.
· Add the possibility to use 65 equidistant levels (∆z = 300 m); choose ivctype=2,
  zspacing_type=’linear’, and choose the combination of zz_top and ke accordingly.
· Choose an isothermal atmosphere.
- Compile the model
- Copy run_ideal_hill to e.g. run_ideal_hill_schaer, modify it accordingly and run the
test.
- Plot vertical cross sections of U, W, T, P and θ as above.


Chapter 7

Troubleshooting for the COSMO-Model

When you work with the software package for the COSMO-Model System, you can run into a number
of problems. Some of them are due to imperfect (or even missing) documentation (but sometimes
also "not reading carefully"), others are caused by program bugs. We apologize right now for any
inconvenience this may cause. But there are also troubles resulting from compiler bugs or hardware
problems.

In the following, we describe some problems that can occur during the different phases you go
through when installing and running the model. Where possible, we also give some hints for the
solution.

7.1 Compiling and Linking

These are the most common difficulties when compiling and linking the software:

1. First you have to find proper compiler and linker options, if there are no example Fopts
files given in the delivered version. This is a task we cannot give support for. You have to take a
look at the manual of your compiler. But we can give one hint: try a moderate optimization
first! Many compiler problems come from very high optimizations.

2. If you have found proper options and can start the compilation, an error message like Unable to
access module symbol file for module netcdf may occur. When you have a library for
NetCDF available and add it to the LIB definition in the Makefile, you also have to give
a compiler option -I xxx/include, indicating the path where the compiled module files for
NetCDF are located.

3. Other compiler messages could be: "MO_MPI" is specified as the module name on a USE
statement, but the compiler cannot find it (error in module mo_fdbk_io.f90) or "WP"
is used in a constant expression, therefore it must be a constant (error in mod-
ule gscp_data.f90). Then you forgot to specify the pragma -D__COSMO__, which is definitely
necessary since Version 5.01.


4. In the linking process, you can get error messages like:

   Unsatisfied external references: .__netcdf_NMOD_nf90_def_dim
   or
   Undefined symbol .rttov7_rttvi

   Then you forgot to link a certain library, to activate a certain dummy_xxx module in the file
   ObjFiles, or you unintentionally specified a preprocessor directive.

5. You can get these messages, even if you put all libraries to the LIB Variable, e.g.

#LIB = /rhome/routarz/libdwd/lib/libgrib1.a \
LIB = /uhome/trngxyz/lib/libgrib1.a \
/uhome/trngxyz/lib/libnetcdf.a

Note: Some systems treat all these lines as a comment, because the first line has the comment-
sign and a continuation-sign at the end.
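
One way to avoid this pitfall is to remove the trailing continuation character from (or simply delete)
the commented-out line, so that the continuation lines belong to an active assignment, e.g. (paths
as in the example above):

#LIB = /rhome/routarz/libdwd/lib/libgrib1.a
LIB = /uhome/trngxyz/lib/libgrib1.a \
      /uhome/trngxyz/lib/libnetcdf.a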

7.2 Troubleshooting in NWP-Mode

For the following examples you need the reference data set. Special run-scripts with Namelist Input
are provided in /e/uhome/fegast3/TRAINING_2016/troubleshooting.

7.2.1 Running INT2LM

1. The most common mistakes when running INT2LM are the specification of ICON external
files (when working with ICON subdomain data) or the specification of the COSMO-Model
domain. Using grib_api also introduced some more possibilities to make errors. Especially
the handling of the correct environment is mandatory to properly work with grib_api. Unfortunately,
the error messages are sometimes misleading, or the INT2LM even crashes.

EXERCISE:
Take the runscripts run_int2lm_x, modify the directories (if necessary, e.g. IN_DIR,
LM_DIR), the path and name of the working directory and the binary (aprun ...). Try to
run the programs using these scripts. Take a look at the error messages in the batch
output and correct the run-scripts to get proper runs.

2. Another problem is demonstrated in the next example. Here, a domain with a finer resolution
is nested into the output of the reference run. For that exercise you need the output of
the 7 km COSMO run.

EXERCISE:
Run the INT2LM using the script runeu2de from the directory above. Modify only
the output directory LM_DIR and the path and name of the binary and the actual
working directory. What is the error? And what could be the reason?
Try to use the output file lfff00000000c from the reference data set instead of
cosmo_d5_07000_965x773.g1_2013111400. Run the corrected script for the 12 hour
forecast (it is needed in the next exercise).


7.2.2 Running the COSMO-Model

1. Some of the problems of INT2LM you will only notice when running the COSMO-Model.
This can happen if the COSMO-Model wants to read certain fields that are not provided by
INT2LM.

EXERCISE:
Run the COSMO-Model with the runscript run_cosmo_test_1 from the directory
above. Modify the input directory to use the results from the INT2LM run from the
last exercise. Modify the output directory OUTPUT and the path and the name of the
binary. Take a look at the error message in the batch output. What is the reason and
what are possible solutions?

2. Sometimes problems may occur even when you think you have set everything perfectly.

EXERCISE:
Run the COSMO-Model with the runscript run_cosmo_test_2 from the directory
above. Modify only the output directory OUTPUT and the path and the name of the
binary. Take a look at the error message in the batch output. What is the reason?

There are surely many more reasons for problems and errors, which we cannot all discuss during
these lessons. It really gets troublesome when a program aborts and writes core directories. Then
you will need some computer tools (like a debugger) to investigate the problem. In some cases it will
help to increase the debug output by setting idbg_level to a higher value and activating debug
output for special components (with ldebug_xxx). If you cannot figure out what the reason for
your problem is, we can try to give some support.

Please send all support requests to the mailing list:

[email protected]


7.3 Troubleshooting in RCM-Mode

In the following, some common errors when running INT2LM or COSMO-CLM in climate mode
are listed together with their solution:

• The COSMO-CLM simulation does not start; instead you get an error message saying ***
Error while opening file YUSPECIF ***.
→ This occurs when you have already YU* files in the current directory from a previous
simulation. You need to delete these files before starting a new run.

• Your INT2LM and CCLM run scripts are arranged to use cloud ice as initial and boundary
conditions, but this field is not available in the GCM data. Both parts will crash.
→ In INT2LM, set lprog_qi = .FALSE., in CCLM, set lana_qi = llb_qi = .FALSE..

• You use an external parameter file including 12-monthly vegetation data. INT2LM does not
start and complains that not all necessary input fields can be found.
→ You need to set itype_ndvi = 2 in INT2LM.

• You use an external parameter file including maximum and minimum vegetation data.
INT2LM does not start and complains that not all necessary input fields can be found.
→ You need to set itype_ndvi = 0 in INT2LM.

• You change your simulation settings from Leapfrog to Runge-Kutta time integration scheme
by setting l2tls = .TRUE., but your run crashes and reports a problem with nboundlines.
→ nboundlines gives the number of lines around each sub-domain which are communicated
among the different processors. For Leapfrog a minimum of 2 boundary lines is necessary; for
Runge-Kutta 3 lines are needed.

• You have prepared an external parameter file with PEP using exactly the domain information
which you intend to use for your CCLM run (namelist &LMGRID in INT2LM), but INT2LM
does not run.
→ For the setup of the data at the boundary (due to interpolation), 2 more lines of data are
required outside the boundary limits. Prepare your PEP file with (at least) 2 more grid points
in each direction.

• Sometimes, the error message of a crashed simulation does not point directly at the problem.
For example, you use the same script and same GCM data with a new INT2LM code version
and get an error in reading the namelist of the input file.
→ It happens sometimes that from one model version to the next a namelist parameter is
skipped or its name has been changed. When you then use an old run script where this
parameter is included, the model has problems reading the namelists correctly. Solution:
check carefully the changes in the code (files misc.global for general COSMO changes and
README_CHANGES for additional RCM changes) before using a new model version!!

EXERCISE:
You will get a list of ’live’ examples on how to apply the COSMO-CLM for a few different
situations, where you can try running the model yourself and hopefully avoid or manage
the mentioned problems. At the end of the lesson, we will discuss the solutions.


Chapter 8

Running COSMO-ART

In this lesson you will learn how to run the package for Aerosols and Reactive Trace Gases, COSMO-
ART. Note that we still use the COSMO-Model version 5.01!

8.1 General remarks

COSMO-ART (ART = Aerosols and Reactive Trace Gases, Vogel et al., 2009) is an extension of
the COSMO model that was developed at the Institute for Meteorology and Climate Research
at the Karlsruhe Institute of Technology (KIT). It allows the online calculation of reactive trace
substances and their interaction with the atmosphere. Since model version 4.11 the ART interfaces
are part of the COSMO model. Users who want to apply COSMO-ART should contact Bernhard
Vogel ([email protected]) in order to get the necessary extensions of the COSMO model.

8.2 Source Code and Installation

In the following we will describe the necessary subroutines, the namelist variables, the input data,
and show some exercises.
Now listed are the ART-specific files additional to the COSMO-model:

Gases and Aerosols

Declaration of fields, constants etc.:

art_aerosol_const.f90
art_gas_const.f90
art_gas_const_radm2.f90
art_io_utilities.f90
art_restart.f90
art_species_data.f90
art_utilities.f90
data_cosmo_art.f90


Emission modules:
art_bvoc_guenther1993.f90
art_bvoc_guenther2012.f90
art_dms.f90
art_dust_shao2004.f90
art_dust_vogel2006.f90
art_dust_vogel2006_orig.f90
art_dust_vogel2006s.f90
art_dust_vogel2006s_orig.f90
art_emiss_param.f90
art_emiss_prescribed.f90
art_frp.f90
art_plumerise.f90
art_seasalt.f90
art_soilnox.f90
Boundary data modules:

art_extpar.f90
art_icbc.f90
Organizing gas phase chemistry and aerosol dynamics:

organize_cosmo_art.f90
Gas phase chemistry, aerosol dynamics:

art_chemkpp.f90
art_deposition.f90
art_papa.f90
art_mademod.f90
art_radm2.f90
art_washout.f90
art_isorropia.f
isrpia.inc
Optical properties of different aerosols:

art_rad_aero.f90
art_rad_dust.f90
art_rad_seas.f90
Aerosol cloud interaction:
art_aci.f90
art_activation.f90
art_nenes_icenuc.f90
parametr.inc


Volcano:

data_volcano.f90
organize_volcano.f90
volc_emissions.f90
volc_species_data.f90

Pollen:

atab.f90
data_pollen.f90
organize_pollen.f90
pol_emissions.f90
pol_seasons.f90
pol_species_data.f90


Installation of COSMO-ART (NWP group)

The source code of COSMO-ART is provided in a tar-file together with a special Makefile, and
the files Fopts, Fopts_debug and ObjFiles. Furthermore, the package contains several directories
named ART3.0, DOCS, DWD_RUNSCRIPTS_TEMPLATES, LOCAL, LOOKUP_TABLES, obj and src. For the
COSMO-CLM-ART Training 2016, the version 3.1 of ART is provided in
/e/uhome/fegast3/TRAINING_2016/cosmoart/cosmo5.1-art3.1
for people using the Cray at DWD, based on COSMO 5.1.
To install COSMO-ART, copy the directory cosmo5.1-art3.1 to your local directory.
This directory contains the subdirectory ART3.1, which contains additional subdirectories and files
(note that for COSMO-ART I/O the NetCDF library is necessary). These subdirectories and files
in ART3.1 are listed below under the item ART.
COSMO:
Makefile                   Makefile for compiling and linking.
DOCS/                      Directory containing some documentation.
DWD_RUNSCRIPTS_TEMPLATES   Directory containing default runscripts.
Fopts                      This file contains compiler options.
Fopts_debug                This file contains debug options for the compiler.
LOCAL                      Directory containing several files with compiler options for various machines.
LOOKUP_TABLES              Directory containing lookup tables, required for running the model.
obj                        Directory containing the object files of the files in src.
src                        Directory containing the COSMO source files.
ObjDependencies            File containing the list of dependencies for the COSMO source code.
ObjFiles                   This is the list of object files for COSMO.

ART:
Makefile_Art               Makefile for compiling and linking.
src_art                    Directory containing the ART source files.
src_kpp                    Directory containing the kpp source files.
ObjDependencies_ART        File containing the list of dependencies for the ART source code.
ObjFiles_ART               This is the list of object files for ART.
Clean up the obj-directory, because the COSMO code also has to be re-compiled now. Copy the
file Makefile_for_ART to your directory cosmo5.1-art3.1 to start the compilation.

- Necessary changes in Fopts:
  To activate compilation of COSMO-ART, insert the option -DCOSMOART, and to activate
  compilation of Pollen, insert the option -DPOLLEN into the compiler call.
- COSMO-ART also contains Fortran 77 subroutines. To handle these routines properly, there
  is a special compiler call F77 in Makefile_for_ART.
- Compile the COSMO code and the COSMO-ART code with
  gmake -f Makefile_for_ART -j 8 artexe.
  This creates the binary lmparbin_art.
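
Put together, the installation steps described above might look like this on the command line. This
is only a sketch: the target directory under $HOME is an assumption, and editing Fopts is of course
done with an editor of your choice.

> cp -r /e/uhome/fegast3/TRAINING_2016/cosmoart/cosmo5.1-art3.1 $HOME/
> cd $HOME/cosmo5.1-art3.1
> # copy Makefile_for_ART into this directory (see above) and
> # edit Fopts: add -DCOSMOART (and, if needed, -DPOLLEN) to the compiler options
> rm -f obj/*                               # clean up, the COSMO code is re-compiled as well
> gmake -f Makefile_for_ART -j 8 artexe     # creates the binary lmparbin_art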


NAMELIST-variables for COSMO-ART

The execution of COSMO-ART can be controlled by the following NAMELIST groups:

RUNCTL    parameters for the model run
ART       parameters for gases and aerosols
VOLC      parameters for volcano
POL       parameters for pollen

With the following namelist parameters you can control different settings of COSMO-ART. LOG-
ICAL switches begin with an ’l’ and you can turn components on (.TRUE.) and off (.FALSE.).
Parameters beginning with another letter are either INTEGER, REAL or CHARACTER:

&RUNCTL
   l_cosmo_art    gases and aerosols
   l_pollen       pollen

&COSMO_ART
   lgas           gas phase
   lkpp           gas phase generated with KPP (Kinetic PreProcessor, test phase)
lgasini initial values for gases available
lgasbd boundary values for gases available
lgas_emiss_in anthropogenic emissions for gases available
lbvoc biogenic emissions for gases available
iart_bvocscheme switch for selecting biogenic VOC emission scheme
(lbvoc must be .TRUE.)
lpoint emissions of point sources available
(ydirin_art) directory of emission data
(ydirin_papa) directory of papa data
laero aerosols (for secondary aerosols lgas must be .TRUE.)
laeroini initial values for aerosols available
laerobd boundary values for aerosols available
laero_emiss_in anthropogenic emissions for aerosols available
ldust emissions of mineral dust (laero must be .TRUE.)
lonly_dust dust is the only transported variable
(laero and ldust must be .TRUE.)
iart_dustscheme select between five dust emission schemes/parameterizations
cwhite_dust select constant for horizontal saltation flux
lsatveg_dust type of vegetation cover for roughness correction
lseas emissions of sea salt (laero must be .TRUE.)
lwash wet deposition of aerosols
lrad_aero submicron aerosol radiation interaction
lrad_dust mineral dust radiation interaction
lrad_seas sea salt radiation interaction
laci_warm aerosol-cloud interaction warm phase


iaci_cold      aerosol-cloud interaction cold phase (four different parameterizations)
acidelay       hour when full aerosol-cloud interaction starts
lfire emissions due to biomass burning (plume rise)
aerostart start hour for aerosols
artstart start hour for ART
hinc_emissions time increment for emissions (h)
hinc_artbounds time increment for boundary data (h)
iart_halo_block length of blocks for ART HALO exchange
lvolcano switch for running volcanic ash

&VOLCCTL
   l_ash_species(6)     switch for volcanic species

&POLLEN
   isp_pollen           number of pollen
   lpollenini           initial values for pollen available
   lpollenbd            boundary values for pollen available
   l_pol_species_d(3)   switch for pollen species
   emiss_formula_d(3)   switch for emission formula

By default, all switches are set to .FALSE. and the start hours are set to 1.

You can run COSMO-ART with different settings for particles and gases. The following table shows
which switches have to be set to .TRUE. for the different settings:

Description              Switches to be set
Prim. Aerosol            l_cosmo_art, laero
Prim. & Sec. Aerosol     l_cosmo_art, laero, lgas
Gases                    l_cosmo_art, lgas
Only Sea Salt            l_cosmo_art, laero, lseas
Only Dust                l_cosmo_art, laero, ldust, lonly_dust
Pollen                   l_pollen
Volcano                  lvolcano
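
As an illustration, the "Only Dust" configuration from the table corresponds to namelist fragments
like the following (a sketch only; all other switches keep their defaults, and the layout of the actual
runfiles may differ):

 &RUNCTL
   ...
   l_cosmo_art = .TRUE.,
 /
 &COSMO_ART
   laero      = .TRUE.,
   ldust      = .TRUE.,
   lonly_dust = .TRUE.,
 /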


Input Data:

To run COSMO-ART the following input data are necessary:

Land use data:

needed in any case to run the model (also for deposition);
Globcover2009 dataset

Gas phase:

Anthropogenic emissions for different species (area sources, point sources etc.) (if available)
Fire Emission Data for biomass burning emissions
(if available, e.g. Global Fire Monitoring data)

Aerosol particles:

Anthropogenic emissions (if available)
Fire Emission Data for biomass burning emissions
(if available, e.g. Global Fire Monitoring data)

Mineral dust:

Land use data of individual plants
Soil type dataset

BVOC:

Plant Functional Type (PFT) dataset

Volcano:

Emission profile of certain volcano

Pollen:

Land use data of individual plants


These files are created from raw input data by the INT2LM-ART extension of the normal COSMO
preprocessor INT2LM (see section 8.4 for more details), as is done for the meteorological input.
Besides the input data, you need the file papa_data.dat for the photolysis. The input files for the
following exercises are provided ready to use.


8.3 Exercises at DWD

To run the test cases, use the binary lmparbin_art that you created in your local cosmo-directory.
You will find the files necessary to run the test cases for COSMO-ART in the directory:
/e/uhome/fegast3/TRAINING_2016/cosmoart

You will find all corresponding runfiles in the directory named above. The changes you have to
make in the runfiles are marked with a "?".
If already prepared input data are used, the directories are given in the corresponding sections.
For your output use your working-directory. The output data will be written in NetCDF. For
a quick visualization of your results you can use ncview. Just use:

ncview l*.nc

8.3.1 Dispersion of idealized area emissions

For the first exercise we use the setup for idealized simulations and simulate the dispersion of gaseous
area emissions.

• Input-directory: /e/uwork/fegast3/INPUT_nataero

• Copy the corresponding runfile run_plume to your local directory.

• Change the directory name for Input, Output, location of lmparbin_art in the runfile and
change the namelist parameters for a simulation only for gases and artificial emissions.

• Use ncview for visualization of your results.

Additionally you can modify the location of the sources in the code:

• Edit the file art_emiss_prescribed.f90 ,

• Search for the Subroutine artif_area_emissions and modify the location of the source.

• Compile and run the job again.

8.3.2 Dispersion of volcanic ash

In case of a volcanic eruption, COSMO-ART can be used for the calculation of the dispersion of
volcanic ash.

• Input-directory: /e/uwork/fegast3/INPUT_volc

• Copy the corresponding runfile run_volc to your local directory.

• Change the directory names in your runfile and set the namelist parameters to a simulation
for volcanic ash.

• Use ncview for visualization of your results.


8.3.3 Emission and dispersion of natural aerosols

During May 2008, which will be simulated in this exercise, mineral dust was transported from the
Saharan desert over Europe. Its effects on the radiative fluxes introduced a discernible bias in the
ground temperature in the NWP predictions for this period.
Sea salt is an important source of natural aerosols. In the model, emissions of sea salt are based on
a wind-speed-dependent parameterization, as shown in this example.

• Input-directory: /e/uwork/fegast3/INPUT_nataero

• Copy the corresponding runfile run_nataero to your local directory

• Adjust your runfile and change namelist parameters for ART for only dust and sea salt
simulation.

• Use ncview for visualization of your results.

8.3.4 Dispersion of anthropogenic aerosols and gases and formation of secondary aerosol particles

This exercise shows the potential of COSMO-ART, as secondary aerosols are calculated prognosti-
cally based on the concentrations of gaseous components.

• Input-directory: /e/uwork/fegast3/INPUT_anthro

• Copy the corresponding runfile run_anthro to your local directory.

• Adjust your runfile and set the namelist parameters for ART for a simulation for gases and
aerosols.

• Use ncview for visualization of your results.

8.3.5 Full model chain of COSMO-ART and INT2LM-ART

This exercise shows you the complete chain of preparing your input data for COSMO-ART with
INT2LM-ART and applying a nest with higher resolution using initial and boundary data from
a coarser COSMO-ART simulation. In the domain with the finest resolution also aerosol-cloud
interactions will be simulated.
To use the full chain you also have to compile INT2LM-ART. You can get it by copying the following
directory to your local directory and compiling it:

/e/uhome/fegast3/TRAINING_2016/cosmoart/int2lm-art

• 1st Input-directory: /e/uwork/fegast3/INPUT_full

• Copy the INT2LM runfile run_int2lm_domain1 to your local directory and adjust this runfile.

• Create the input data for your first domain

• Copy the COSMO-ART runfile run_cosmoart_domain1 to your local directory and adjust it.

• You can use ncview for a quick look at your results.


• Create your own input data with INT2LM-ART and former produced results to do a nest
within COSMO-ART.

• Copy the runfile run_int2lm_domain2 to your local directory and adjust it. Use the output
of your run_cosmoart_domain1 .

• Copy the COSMO-ART runfile run_cosmoart_domain2 to your local directory and adjust it.

• You can use ncview for a quick look at your results.

• Copy the runfile run_int2lm_domain3 to your local directory and adjust it. Use the output
of your run_cosmoart_domain2 .

• Copy the COSMO-ART runfile run_cosmoart_domain3 to your local directory and adjust it.

• Change the directory names in your runfile and set the namelist parameters for a simulation
for aerosol cloud interaction. Therefore use the settings as before, plus:
&PHYCTL:
itype_gscp = 2483
&COSMO_ART:
laci_warm = .TRUE.
iaci_cold = 4

• Use ncview for visualization of your results.

8.4 INT2LM-ART

The meteorological preprocessor for COSMO - INT2LM (Documentation of COSMO-model, Part
V: Preprocessing) - was designed to interpolate necessary initial and boundary conditions for me-
teorological quantities from a coarse grid onto the model grid in the atmosphere and at the surface
as well as for the soil model. For more general datasets INT2LM was extended by Knote (2012)
to interpolate e.g. aerosol or gaseous species initial and boundary conditions. It is also possible to
merge datasets by adding them or by replacing the coarser dataset on a limited domain by a finer
grid dataset.

This extended preprocessor INT2LM-ART is controlled by two logical namelist parameters within
the CONTROL namelist: l_art is the main switch for the INT2LM-ART extension, l_art_nested
indicates whether initial/boundary conditions from a coarser COSMO-ART run are provided.

Furthermore, the ARTCONTROL and ARTDATASET namelists are necessary. Examples for these
can be seen in Namelist examples 1 and 2. A further description of the usage of INT2LM-ART can
be found in the documentation by Knote (2012).

&ARTCONTROL
nart_ds = 1,
/

Namelist example 1: ARTCONTROL

With the following namelist parameters you can control different settings of INT2LM-ART. The
convention about LOGICAL, INTEGER, REAL and CHARACTER is similar to COSMO-ART.
For default values or further descriptions, have a look at the in-code remarks.


&ARTDATASET
ie_tot=202, je_tot=192, ke_tot=7,
startlon_tot=-20.17, startlat_tot=-15.07,
dlon=0.17, dlat=0.17,
pollon = -170.0, pollat = 43.0, polgam = 0.0,
hinc=1.0, cvertdist=1.0,
lconst_in_time=.FALSE., lconserve_mass=.TRUE.,
ydirin="/mnt/workspc/kno134/int2cosmo/input/emissions/macc_test/",
ylfn_prefix = "macc_", yvarlist= "SO2e","NO2e",
yvertical_axis_type="geometric",
yvertical_method="D", yinterp_type="N", ycombine_action = "O"
/

Namelist example 2: ARTDATASET

&CONTROL
l_art main switch for the ART extension
l_art_nested tracer initial/boundary data is provided from coarser COSMO run
&ARTCONTROL
nart_ds number of tracer input datasets
&ARTDATASET
ie_tot number of grid points in longitude direction
je_tot number of grid points in latitude direction
ke_tot number of vertical levels
startlon_tot longitude of first grid point
startlat_tot latitude of first grid point
dlon longitudinal grid spacing
dlat latitudinal grid spacing
pollat latitude of the rotated north pole
pollon longitude of the rotated north pole
polgam angle between the north poles of the systems
hinc time increment (hrs) for new data
cvertdist(:) percentages of artificial vertical distribution
lconst_in_time data do not vary over time
lconserve_mass try to conserve mass globally on interpolation
ydirin input data path
ylfn_prefix input file name prefix
yvarlist(:) tracers to be read
yvertical_axis_type     type of the vertical axis (“geometric”, “pressure”, “hyb_sig_pr”)
yvertical_method        how to redistribute data vertically (“I” interpolate, “D” distribute
                        pressure-weighted, “A” artificially, based on cvertdist)
yinterp_type            interpolation type (“L” bilinear, “N” nearest neighbour, “A” aggregation)
ngrid_refine            number of grid refinements
ycombine_action         combination of multiple datasets (“A” add, “R” replace, “O” overwrite)


Appendix A

The Computer System at DWD

A.1 Available Platforms at DWD

The NWP exercises this week will be mainly carried out on the CRAY supercomputer system at
DWD. This system was installed in December 2013 and upgraded in December 2014. It consists of
364 compute nodes based on the Intel Ivy Bridge processor and 432 compute nodes based on Haswell,
with corresponding service nodes. Some of the service nodes are the login nodes (with the names
xce00.dwd.de and xce01.dwd.de), on which you can use training accounts.

• xce00, xce01:
These are the login nodes, which run SUSE Linux Enterprise Server (SLES). These nodes
are used for compiling and linking, preparation of data, basic editing work and visualization
of meteorological fields. They are not used for running parallel programs, but jobs can be
submitted to the Cray compute nodes.
• Cray XC30:
The Cray XC30 has 364 compute nodes, each with 2 Intel Ivy Bridge processors with 10 cores.
Each node thus has 20 computational cores. These nodes cannot be accessed interactively,
but only by batch jobs. Such jobs can use up to 30 GByte of main memory per node, which is
about 1.5 GByte per core. For normal test jobs it should be enough to use less than 10 nodes
(depending on the size of domain chosen).
• Cray XC40:
In December 2014 the machine has been extended by 432 compute nodes with 2 Intel Haswell
processors each. This part of the machine is now called Cray XC40. To use the Haswell
processors, a special compiling environment has to be used. But the exercises this week will
only be carried out on the XC30, which uses the standard compiling environment.

There is a common filesystem across all nodes and every user has 3 different main directories:

• /e/uhome/<username> ($HOME)
Directory for storing source code and scripts to run the model. This is a Panasas-filesystem
suitable for many small files.
• /e/uwork/<username> ($WORK)
Directory for storing larger amounts of data.
• /e/uscratch/<username> ($SCRATCH)
Directory for storing very large amounts of data. For the $WORK and the $SCRATCH filesystem
there is a common quota for every user of 2.5 TByte.


A.2 The Batch System for the Cray

Jobs for the Cray XC30 system have to be submitted on xce00/01 with the batch system PBS.
Together with the source code of the programs we provide some run scripts in which all neces-
sary batch-commands are set. Some of them might have to be adapted. See Section A.3 for more
information on the run scripts.
Here are the most important commands for working with the PBS:

qsub job_name to submit a batch job to PBS. This is done in the run-scripts.
qstat to query the status of all batch jobs on the XC30. You can see
whether jobs are Q (queued) or R (running). You have to submit jobs
to the queue xc_normal.
qstat -u <user> to query the status of all your batch jobs on the machine.
qdel job_nr to cancel your job(s) from the batch queue of a machine. The job_nr
is given by qstat.
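
A typical interactive sequence on xce00/01 therefore looks like this (the user name is only a
placeholder for your training account, and the job number has to be taken from the qstat output):

> qsub cosmo_eu_job          # normally done for you by the run-script itself
> qstat -u trngxyz
> qdel <job_nr>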

A.3 Cray Run-Scripts for the Programs

The directories for the INT2LM and the COSMO-Model contain run scripts (e.g. runicon2eu and
run_cosmo_eu, resp.) which cat together the NAMELIST-Input of the programs and start the
binaries. A number of exemplary runscripts for real and idealized cases are stored in the subdirectory
RUNSCRIPTS, and the idea is that you copy an appropriate script to a separate directory and modify
it to your needs. The runscripts which you find directly in the model directory are already copies
from the RUNSCRIPTS subfolder.
In case of the COSMO-Model, there are two types of scripts, real cases and idealized cases. The
names of the runscripts for idealized cases start with run_ideal whereas those for real cases do
not contain the word ideal. The scripts for INT2LM are similar to the real case scripts of the
COSMO-model. Idealized simulations do not need preprocessing by INT2LM. Both types of scripts
are described briefly in the following.
These scripts are written for the Cray, but can be easily adapted to other platforms. All scripts work
in a way that they create a batch description job (e.g. int2lm_job or cosmo_eu_job, resp.) which
is submitted to the batch system (the PBS) using the qsub command. These scripts are more or less
only examples. Before running the programs, you have to adapt the scripts / NAMELIST-Input to
your needs.

A.3.1 Scripts for COSMO-model real case simulations and INT2LM

There are five parts in each script:

1. Number of processors
Here you have to change the number of processors, but also some directory names and the
date. Your settings will be transformed to the namelist input below.

NPX, NPY for a parallel program these values can be chosen as needed, for a sequential
program, both values have to be 1.

APPENDIX A. THE COMPUTER SYSTEM AT DWD COSMO-Model Tutorial


A.3 Cray Run-Scripts for the Programs 75

NPIO for a parallel program additional processors can be chosen for doing the I/O
(see also the User Guides); for a sequential program this value has to be 0.
Unless working with very big domains (≥ 800 × 800 grid points), using NPIO
> 0 is not recommended!
NODES this is needed for requesting computational resources. The user does not
need to specify it; it is computed from NPX and NPY.
DATE To specify the initial time of the forecast.

For the INT2LM

LM_EXT To specify the directory with the external parameters for the COSMO-Model
IN_EXT To specify the directory with the external parameters for the driving (input)
model
IN_DIR To specify the directory with the input data from the driving (input) model
LM_DIR To specify the directory with the output data (which will be the input data
for the COSMO-Model).

For the COSMO-Model

INIDIR To specify the directory with the initial data.
BD_DIR To specify the directory with the boundary data.
OUTPUT To specify the directory where the result files are written.
RESTAR To specify the directory where the restart files are written (if any).

2. Batch commands
On the Cray system of DWD you have to start the programs using the PBS (the batch system
runnning on the Cray). This section of the run script sets most of the appropriate values.

#PBS -q xc_normal        To put a job to the compute nodes of the XC30.
#PBS -l select=$NODES    To specify the number of compute nodes.
#PBS -l place=pack If only one compute node is used.
#PBS -l place=scatter If more than one compute node are used.
#PBS -j oe To put standard error and standard out to the same de-
vice.
#PBS -N To give a special name to the job.

In addition, you have to specify the working directory (RUNDIR), where the job should be
executed. If this directory is not set, the job runs in a temporary directory, but all ASCII
output will be lost then. If a non-existing directory is specified, the job terminates with an
error message. This command tries to run the job in the home-directory of user fegast3.

cd /e/uhome/fegast3/

In some run-scripts the settings for other machines and batch systems are also contained; they
can be safely ignored for these tests.

3. Cat together the INPUT*-files
   In this part the INPUT-files are created that are needed by the INT2LM and the COSMO-
Model, resp. The user can choose the NAMELIST-variables as needed. For a complete de-
scription of the NAMELIST-variables see Part V (Preprocessing: Initial and Boundary Data
for the COSMO-Model) and Part VII (Users Guide) of the COSMO-Model Documentation.


For the purpose of these exercises, the most important namelist values for both programs are
specified in Chapters 2 and 3.

4. Run the programs
   This part contains the call to invoke the program. On Cray systems this is

aprun -n <number_of_tasks> -N 20 -j 1 -d 1 -m 1500 name_of_binary.

-n   To specify the number of MPI tasks.
-N   To specify the number of cores used per node.
-j   To specify the number of hyperthreads per core. Hyperthreading is not used during these
     exercises, so this always has to be 1.
-d   To specify the number of threads per MPI task. No hybrid parallelization is used during
     these exercises, so this also has to be 1.
-m   To specify the amount of main memory per core.

name_of_binary can be the name of the program, if the working directory is set correctly,
or you can also specify an absolute path name. Some programs need special environment
variables to be set. This has to be done before invoking the aprun-command.
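
For example, with NPX=10, NPY=10 and NPIO=0 (as in the skeleton script below), 100 MPI tasks are
needed and the call for the COSMO binary reads (all values except -n taken from the template
above; the exported variable is one example of the environment settings mentioned):

export LIBDWD_FORCE_CONTROLWORDS=1
aprun -n 100 -N 20 -j 1 -d 1 -m 1500 lmparbin_all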

5. Cleanup
This is to clean up the directory at the end of the run.

The last part in these scripts submits the batch jobs to the machine. For starting a program run,
you just have to type the name of the run-script as a command, e.g.: run_cosmo_eu.

Note: If you start the same script again, all output data from the last model run will be lost! To
preserve the output files for the model fields (lfff∗), you have to change the output directory, and
to preserve the YU∗ files, you have to copy/move them to another place, e.g., the output directory
of the run to which they belong.


To summarize, the following skeleton script presents the basic framework of these scripts:

#!/bin/ksh

#################################################
# number of processors
#################################################

NPX=10
NPY=10
NPIO=0
CPN=20                           # cores per node
NP1=`expr $NPX \* $NPY`
NP=`expr $NP1 + $NPIO`
N1=`expr $NP + $CPN - 1`
NODES=`expr $N1 \/ $CPN`         # number of physical nodes

#################################################
# start date and time
#################################################

DATE=20140501000000

#################################################
# directories
#################################################

RUNDIR=/e/uhome/fegast3/

BASE=/e/uscratch/uschaett/DATA/${DATE}
INIDIR=$BASE/COSMO_DE_input
BD_DIR=$BASE/COSMO_DE_input
OBSDIR=$BASE/COSMO_DE_obs
RADARI=$BASE/COSMO_DE_obs
OUTPUT=$BASE/COSMO_DE_output
RESTAR=$BASE/COSMO_DE_output

#################################################
# output data format: (grb1, ncdf, or api2)
#################################################

FORM=grb1

#============================================================================
# The following cats together the script-file "cosmo_de_job" which is later
# submitted to the batch queue. All text until the "****" down below is
# put into that file. In this way, we can influence the script text, e.g.
# namelist parameter settings, by the above (and some own) shell variables.
#============================================================================

cat > cosmo_de_job << ****       # <-- Here begins the code for the cosmo_de_job

#################################################
# batch commands
#################################################

#PBS -q normal

...

cd $RUNDIR

...

#################################################
# some cleanup of old files of the last run
#################################################

rm -f YU* M_*
rm -f $OUTPUT/l*
rm -f $RESTAR/l*

#################################################
# cat together the INPUT*-files (namelists)
#################################################

cat > INPUT_ORG << end_input_org
 &LMGRID
  ...
end_input_org

cat > INPUT_SAT << end_input_sat
 &SATCTL
  ...
end_input_sat

cat > INPUT_IO << end_input_io
 &IOCTL
  ...
end_input_io

cat > INPUT_DYN << end_input_dyn
 &DYNCTL
  ...
end_input_dyn

cat > INPUT_PHY << end_input_phy
 &PHYCTL
  ...
end_input_phy

cat > INPUT_DIA << end_input_dia
 &DIACTL
  ...
end_input_dia

cat > INPUT_EPS << end_input_eps
 &EPSCTL
  ...
end_input_eps

cat > INPUT_ASS << end_input_ass
 &NUDGING
  ...
end_input_ass

#################################################
# run the program
#################################################

...

export LIBDWD_FORCE_CONTROLWORDS=1
export LIBDWD_BITMAP_TYPE=ASCII
export ...

aprun ... lmparbin_all

...

#################################################
# cleanup
#################################################

rm -f INPUT_ORG INPUT_IO INPUT_DYN INPUT_DIA INPUT_PHY INPUT_INI
rm -f INPUT_ASS INPUT_SAT INPUT_EPS

****                             # <-- Here ends the code for the cosmo_de_job

#################################################
# submit the job to the batch queue
#################################################

qsub cosmo_de_job


A.3.2 Scripts for idealized COSMO-model simulations

The scripts for the idealized simulations are somewhat different, although the same general philosophy
applies of creating and submitting a job file, which starts the (parallel or serial) program on the
requested resources (nodes, memory). First of all, there is the additional namelist ARTIFCTL in the file
INPUT_IDEAL. Also, the first section for user-defined parameters is different. Then, there are already
platform-dependent script sections for a number of different platforms, from which the user can
choose, or a new platform may be added along those lines.

Finally, the “management” of a model run is different:

1. Create the output directory. If the specified directory already exists, a new one is created,
suffixed by a count number starting from one. For further consecutive runs, the counter
is increased until the first non-existing directory name is found and created.

2. All files which are necessary to reproduce the run are archived in the output directory.

3. All necessary input files are created directly in the output directory. This is done by the
runscript itself and not by the runscript-created jobscript make_lm_job, whereas for real cases,
this is done by the jobscript cosmo_de_job (see above). Then, the script starts the model run
from the output directory. In the current directory, there are only links to files for the standard
output and standard error of the job.

4. After completion of the model run, all output files, including the YU∗ files and the standard
output and error, can be found in the output directory.

In this way, the parallel and/or consecutive execution of several model runs from the same (maybe
slightly modified) script and the same directory is safely possible without the danger of data loss.
It is, however, the user’s responsibility to delete the results of unsuccessful runs!


To summarize, the following skeleton script presents the basic framework of these scripts:

#!/bin/bash

#===============================================================================
# BEGIN OF USER PARAMETER SETTINGS
#===============================================================================

#===========================================================================
# Which machine are we on?

machine='cray-xc30'

#===========================================================================
# Number of Processors (PEs):

# in X-dir:
NPX=20
# in Y-dir:
NPY=1
# PEs for asynchronous IO (normally 0):
NPIO=0

#===========================================================================
# Root directory of the model src-code (for backup).
# This is the master directory of the model source, i.e.,
# where the directories src/ and obj/ reside.

lmcodedir=$(pwd)/..

#===========================================================================
# Names of the executable and files for stdout/stderr:

lmexecutable=${lmcodedir}/tstlm_f90

lmoutput=lm.out
lmerrput=lm.err
jobname_in_queue=lmtest

#===========================================================================
# model start time (provide dummy time in case of idealized runs)
# in format YYYYMMDDHH or YYYYMMDDHHMMSS

modelstarttime=2003032100

#===========================================================================
# Output directory:

outputdir=$WORK/COSMO/IDEAL_HILL

# NOTE: If the output directory exists, the name will be incremented by a
#       counter to avoid unwanted data losses!

#===========================================================================
# Input directory (only needed for idealized restart runs):

inputdir=$WORK/COSMO/IDEAL_HILL_IN

#===========================================================================
# if itype_artifprofiles = 1 (namelist ARTIFCTL) the following radiosonde file
# will be used for initial profiles:

rasopfad=$(pwd)
rasodatei=${rasopfad}/raso_wk_q14_u05.dat

#===========================================================================
# Machine dependent settings:

if [ ${machine} = sx9dwd ]
then
  ...
elif [ ${machine} = sx8dwd ]
then


...
elif [ ${machine} = ibmham ]
then
  ...
elif [ ${machine} = xc2kit ]
then
  ...
elif [ ${machine} = cray-xc30 ]
then
  ...
elif [ ${machine} = localpc ]
then
  ...
fi

#===============================================================================
# END OF USER PARAMETER SETTINGS ! ! ! NOW MODIFY THE NAMELISTS BELOW ! ! !
#===============================================================================

#######################################################
# Preparation: Set up LMDIR and check other parameters:
#######################################################

LMDIR=$(pwd)

# If the output directory exists, increment the name by a counter to
# avoid unwanted data losses:

< some code > ...

jobdatei=$outputdir/make_lm_job

ln -sf ${outputdir}/${lmoutput} ${LMDIR}/${lmoutput}
ln -sf ${outputdir}/${lmerrput} ${LMDIR}/${lmerrput}

####################################################
# copy necessary start files to the output directory
####################################################

cp ${rasodatei} ${outputdir}

#################################################
# change to output dir, prepare the INPUT files
# and start the model run
#################################################

cd ${outputdir}

#################################################
# Batch system commands depending on machine
#################################################

if [ ${machine} = sx9dwd -o ${machine} = sx8dwd ]


then
...
elif [ ${machine} = cray-xc30 ]
then

# Start script of the parallel jobs (e.g., "mpirun", "startmpi", ...)


if [ $NODES -gt 1 ]
then
  parstartscript="aprun -n ${NP} -N ${tasks_per_node} -j 1 -d 1 -m ${maxmemory}"
else
  parstartscript="aprun -n ${NP} -m ${maxmemory}"
fi

cat > $jobdatei << markee

#!/usr/bin/ksh
#PBS -q ${jobklasse}
...

markee


elif [ ${machine} = ibmham ]
then
  ...
elif [ ${machine} = xc2kit ]
then
  ...
fi

#################################################
# some cleanup of old files of the last run
#################################################

rm -f YU* M_*

#################################################
# cat together the INPUT*-files (namelists)
#################################################

cat > INPUT_ORG << end_input_org


&LMGRID
...
end_input_org

...

cat > INPUT_IDEAL << end_input_artifctl


&ARTIFCTL
...
end_input_artifctl

#################################################
# batch submit commands depending on machine
#################################################

if [ ${machine} = sx9dwd -o ${machine} = sx8dwd ]
then
  ...
elif [ ${machine} = cray-xc30 ]
then

cat >> $jobdatei << marke

cd ${outputdir}

#################################################
# run the program
#################################################

...
export OMP_NUM_THREADS=1
...

${parstartscript} ${lmexecutable}

marke

#################################################
# save some files documenting the model run
# (src/*, Makefile, Fopts, Obj*) to the output
# directory and submit job:
#################################################

sichere_src            # This saves the files

qsub $jobdatei         # This submits the job

elif ...
...
fi


Appendix B

The Computer System at DKRZ

B.1 Available Platforms at DKRZ

The climate simulation exercises will be carried out on the supercomputer at the DKRZ in Hamburg.
It is called "mistral" and, with a peak performance of 1.4 PetaFLOPS, consists of 1,500 compute nodes,
36,000 compute cores, 120 Terabytes of memory, and 20 Petabytes of disk space.
The computer system is accessible at: mistral.dkrz.de
The file system structure is as follows:

• /pf/[a,b,k,m,u,g]/<username> ($HOME)
Directory for storing source code and scripts to run the model (16 GB quota per user, backup
provided).

• /work/<project>/<username>
Directory for storing larger amounts of data, which are accessed frequently (quota per project).

• /scratch/[a,b,k,m,u,g]/<username> ($SCRATCH)
Directory for storing large amounts of data which are only needed for a short time. Files will
be deleted frequently (generally after 14 days and after 7 days when space is limited).

The project ID for this year's training course is bb0721; thus the disk space in
/work/bb0721/<username>/ can be used for the exercises.
Note that the mistral system should only be used for compilation and job submission.

B.2 The Batch System for the mistral

On the mistral, the SLURM system is used for the submission of batch jobs. Generally, only a few
options have to be changed in the SLURM commands for a new job.
Here are the most important commands for working with SLURM:

sbatch job_name to submit a batch job. In the step-by-step exercises, this command
has to be used to submit the job scripts; for the chain environment,
this is included in the run-scripts.


squeue -u username to query the status of all your batch jobs. You can see in the column
ST whether your jobs are pending (PD) or running (R).

scancel job_nr to cancel your job(s) from the batch queue of a machine. The job
number job_nr is given by squeue.
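
A typical interactive sequence on mistral could thus look as follows (the job script name and the
job number are only examples):

sbatch cclm.job        # submit the job script
squeue -u $USER        # check its status (column ST: PD = pending, R = running)
scancel 1234567        # cancel it, using the job number reported by squeue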

B.3 Script environment for running the COSMO-CLM in a chain

Inside the chain directory cclm-sp/chain, you will find a number of files and directories:

arch/ Directory for archived simulation output.

gcm_to_cclm/ Directory where you run a downscaling simulation with gcm data as forcing.

cclm_to_cclm/ Directory where you run a nested simulation with cclm data as forcing.

scratch/ Directory where data for input and output of the chain scripts are temporarily stored.

work/ Directory where you find the log files of your simulation, job outputs, restart files and
simulation output of the post-processing job.

B.3.1 The scripts

To use the computer facilities effectively, the whole job chain is not carried out in just one script,
but divided into five scripts to perform the main sub-tasks of the cclm chain:

prep.job.tmpl
All necessary preprocessing is performed by this script, e.g., copying boundary data from
the archive and transforming them into int2lm compatible netCDF format. In the standard
template copying is performed from hard disk. However, the script can be changed to perform
copying from tape or via ftp or scp.

int2lm.job.tmpl
The job to run INT2LM. For detailed information about the INT2LM namelists see Chapter 2.

cclm.job.tmpl
The job to run CCLM. For detailed information about the CCLM namelists see Chapter 3.

arch.job.tmpl
Archiving of the results is performed in this script. In the standard template archiving is
performed on hard disk. However, the script can be changed to perform archiving on tape or
via ftp or scp.

post.job.tmpl
Any post processing needed is performed here. Archiving of postprocessing results is also done
here. Post-processing may take a lot of time, depending on its complexity. This may slow down
the whole chain. Therefore it is run in parallel to the rest of the chain.

and one main script to rule them all:


subchain
This main script prepares the environment variables for the other scripts, builds the *.job files
for the current month in the chain from the *.tmpl files, and submits the resulting *.job files.

Before you start your simulation, change the scripts to your needs. Fig. B.1 shows the job flow of
these scripts.

Please keep in mind that you may only be allowed a limited number of serial jobs running at the
same time (on the DKRZ mistral the limit is 10 serial jobs). To prevent your chain from being slowed
down, reserve two serial jobs for the chain, since post can run at the same time as cclm and as the
prep and int2lm jobs of the following month.

Changes to be made in the file job_settings:

EXPID an experiment identification, alphanumeric characters

PFDIR the basis directory for the different experiments (has to exist)

WORKDIR a working directory on the work filesystem for log output (has to
exist)

SCRATCHDIR a working directory on the scratch filesystem for intermediate out-


put (has to exist)

ARCHIVE_OUTDIR a working directory on the work filesystem for storing the final model
output (has to exist)

YDATE_START/YDATE_STOP start and end date of the simulation

EMAIL_ADDRESS is needed if the mailx commands are used and if notification is
set to something other than never in the LoadLeveler commands.

UTILS_BINDIR the path where the auxiliary program cfu can be found

INT2LM_EXE This variable can be found a little further down in the script and
gives the name and full path of the INT2LM executable.

CCLM_EXE This variable can be found a little further down in the script and
gives the name and full path of the COSMO-CLM executable.
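
As an illustration only, a fragment of job_settings could look as follows; all paths, the experiment
id and the dates are examples and have to be adapted to your own account and experiment:

# illustrative values only - adapt to your environment
EXPID=sp001
PFDIR=/work/bb0721/<userid>/experiments
WORKDIR=/work/bb0721/<userid>/work
SCRATCHDIR=/scratch/<x>/<userid>/cclm
ARCHIVE_OUTDIR=/work/bb0721/<userid>/arch
YDATE_START=2000010100
YDATE_STOP=2000030100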

The files in the templates subdirectory only need to be changed when namelist parameters other
than those in the global settings of the subchain script have to be adapted. There is one parameter,
however, which probably has to be changed in int2lm.job.tmpl: the path and name of the external
data, ylmext_cat and ylmext_lfn (or further variables if a totally different parameter file is used).
Also, be aware that in yinext_lfn (name of the coarse model input data files) you may have to use
cas instead of caf, depending on the coarse data.

The templates for the int2lm and cclm batch jobs contain three parts, similar to
the parts 2-4 described for the NWP model system (Section A.3):

1. Batch commands:
Here, SLURM commands have to be set, such as the number of nodes and tasks per node the
job should use (parameters starting with @ are set in the subchain script or in the
job_settings file):


#SBATCH --nodes=@{NODES}
#SBATCH --ntasks=@{NP}

A guess for the wall clock time of the job:

#SBATCH --time=00-00:30:00

the jobname

# @ job_name = int2lm_@{EXPID}

and the logfiles which will be written:

# @ output = @{JOBLOGFILE}%j
# @ error  = @{JOBLOGFILE}%j

Your email address in order to get information about e.g. errors:

#SBATCH --mail-user=@{EMAIL_ADDRESS}

The account that should be charged:

#SBATCH --account=@{PROJECT_ACCOUNT}

2. Cat together the INPUT*-Files:


This part contains all the namelist parameters which are set for the INT2LM and the COSMO-
CLM model. Parameters which are not set here will be assigned their default values.
For a complete description of the NAMELIST variables see Part V (Preprocessing: Initial and
Boundary Data for the COSMO-Model) and Part VII (User's Guide) of the COSMO-Model
Documentation.
3. Run the programs:
The call for running the model in parallel mode on the mistral is srun.
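
As a sketch only, the corresponding line in the job templates could have the form shown below,
using the placeholder conventions described above (the exact srun options in the actual templates
may differ):

# hypothetical example of the model start in cclm.job.tmpl
srun -n @{NP} @{CCLM_EXE}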

B.3.2 The environment variables

The cclm chain distinguishes between four types of environment variables, defined in the script
subchain:

constant global
these are used in at least two of the scripts and keep the same value throughout the whole sim-
ulation. They are defined at the beginning in the script subchain. Examples: basis directories,
experiment id, grid description ($PFDIR, $EXPID, ...)
time dependent global
these are used in at least two of the scripts and change their values during the simulation.
They are defined at the beginning of the script subchain. Examples: current month ($CUR-
RENT_DATE)
constant local
these are used by just one of the scripts and keep the same value throughout the whole
simulation. Strictly speaking, they do not have to be defined at all, since the values could be put
directly into the script. However, the scripts become cleaner and errors are minimized if the same
value does not have to be changed in several places. Examples: number of nodes, processors per node,
processors ($NODES, $PPN, $NP)


time dependent local


these are either used just by one of the scripts or may have different values in different scripts.
They change their values during the simulation. Examples: the logfiles ($JOBLOGFILE)

In the template scripts these environment variables are written with a @ instead of a $ as the first
character (e.g. @EXPID instead of $EXPID) to make clear that they will be replaced via the sed
command by the subchain script.
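
As an illustration (not the literal code of subchain), such a replacement could be done along the
following lines:

# hypothetical example: build the actual job file from the template,
# replacing the @-placeholders by the current values
sed -e "s|@{EXPID}|${EXPID}|g" \
    -e "s|@{NODES}|${NODES}|g" \
    -e "s|@{NP}|${NP}|g" \
    cclm.job.tmpl > cclm.job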

B.3.3 Extensions

In the directory $SPDIR/chain/extensions several extensions for a chain job are available. In the
README files you find instructions on how to implement these extensions.

Alternative archive format

Directory: $SPDIR/chain/extensions/archive_format. In the template post.job.tmpl the results
are combined into tar-files for archiving. You can choose between alternative templates; you are
also free to change post.job.tmpl to your needs.

Boundary Data

Directory: $SPDIR/chain/extensions/boundary_data. Settings for alternative global reanalyses
and GCMs are stored here. Presently, settings for the following global model data are available:
ERA40, ERAInterim, NCEP-RA1.

SAMOA

Directory: $SPDIR/chain/extensions/samoa.
SAMOA (SAnity check for MOdels of the Atmosphere) is a script developed at the Karlsruhe
Institute of Technology (KIT)¹ in order to perform a basic check of COSMO, COSMO-CLM and
COSMO-ART model results. When the model writes out large numbers of variables over long periods
of time, it is no longer possible to check all the output manually. SAMOA allows detecting major
flaws in the results and prevents such results from being archived. The script is based on CDO (for
further dependencies check the header of the script) and checks, for each variable in the output,
whether all values lie within a range provided in a list (samoa.list in the samoa subfolder). If a
variable is in the model output but not in samoa.list, or if its name has changed, SAMOA issues a
warning. In that case samoa.list must be changed accordingly. What SAMOA does NOT do: if SAMOA
does not give an error, this does not mean that your results are correct. Neither physical consistency
checks between different variables nor checks for biases are performed. As the name of the script
implies, this is only a sanity check tool, and the user still needs to validate the results after checking
them with SAMOA.
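
As a simple illustration of the kind of check SAMOA performs (this is not the SAMOA script itself;
the file and variable names are only examples), the value range of a single variable can be inspected
with CDO:

# print minimum, mean and maximum for the 2m temperature of an output file;
# the printed range can then be compared with the bounds given in samoa.list
cdo infon -selname,T_2M lffd2000013100.nc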
¹ The script is currently maintained at KIT. In case you have a suggestion for improvement or you
find a bug, please feel free to contact KIT.


B.3.4 Running the CCLM chain

The steps for running the COSMO-CLM chain are:

1. Create a directory ${PFDIR}/${EXPID} by hand (all others will be created automatically by
the scripts). Copy the contents of gcm_to_cclm or cclm_to_cclm into that directory:

cp -R whateverdir/chain/<input>_to_cclm/* ${PFDIR}/${EXPID}

2. Change the scripts to your needs

3. Start the chain by typing

./subchain start

If the job chain crashes you can start the chain again with any of the scripts by typing

./subchain <scriptname>

For prep (except at the very beginning of the chain) and int2lm you have to provide a second argu-
ment with the date for which the script should be started. The date cannot be taken automatically
from the file date.log since prep and int2lm can also be run for one month ahead of the other
scripts.
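
For example, restarting the chain at the int2lm step for February 2000 could look like this (the
date is only an example; its format follows the conventions used in job_settings):

./subchain int2lm 2000020100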

If the cclm simulation is successful you can find the output data under the directories
$SPDIR/chain/arch/sp001 and as post-processed time series under $SPDIR/chain/work/sp001/post.

B.3.5 The directory structure

Fig. B.2 shows the directory structure of the CCLM chain simulation. By default the SCRATCH
directory will be completely removed after job completion. The basic archive input and output
directories have to be defined by the user at the beginning of subchain (see above), and any further
special treatment has to be defined in the files prep.job.tmpl and post.job.tmpl, respectively.

B.3.6 Remarks

• INC_DATE=MM:DD:HH specifies the time increment for the CCLM job (the other parts of the
chain are always run monthly!). Valid values are 01:00:00, 00:01:00, 00:02:00, ..., 00:27:00.

• Do not change any files in the directory jobs. This will have no effect, since these files will
be overwritten by subchain with the files from the directory templates. If you want to make
changes, always apply them to the files in templates.

• It is assumed that not only the login node but also the compute nodes can submit jobs. If
this is not the case on your computer system, you need to find a workaround (e.g. by using
the nqwait command on the NEC-SX8).

More information can also be found in the documentation of the Starter Package.

APPENDIX B. THE COMPUTER SYSTEM AT DKRZ COSMO-Model Tutorial


B.3 Script environment for running the COSMO-CLM in a chain 89

Figure B.1: Job Flow of the CCLM subchain scripts


Figure B.2: Directory structure of the CCLM subchain scripts


Appendix C

Necessary Files and Data for the NWP Mode

C.1 Availability of Source Codes and Data

To do the installation and perform some test runs, the following files and data are needed. As a
registered user you can download the data from DWD’s ftp-server:

Address: ftp-incoming.dwd.de, User: feu3.

For this course you will find all necessary data on the xce00/01 in subdirectories of

/e/uhome/fegast3/TRAINING_2016/

Name                                      Description                                 Directory

DWD-libgrib1_110128.tar.gz                GRIB1-library                               source
int2lm_150611_2.02.tar.bz2                INT2LM source code                          source
cosmo_151202_5.03.tar.bz2                 COSMO-Model source code                     source
reference_data_5.03.tar.bz2               reference data set                          data
cosmo_dn_rrrrr_iiixjjj.g1_date            external parameters for the
                                          COSMO-Model domain                          topo
icon_extpar_0026_R03B07_G_20141202.nc     external parameters for global ICON         topo
icon_extpar_<region>_R03B07_20141202.nc   external parameters for a special region    topo
icon_grid_0026_R03B07_G.nc                external grid file for global ICON          topo
icon_grid_<region>_R03B07.nc              external grid file for a special region     topo
icon_hhl_0026_R03B07_G_20141202.g2        external HHL file for global ICON           topo
icon_hhl_<region>_R03B07_20141202.nc      external HHL file for a special region      topo
invar.i384a                               external parameters for GME                 topo


The names of the external parameters for special COSMO domains usually contain the name of the
region (dn), the resolution (rrrrr: either in meters or in degrees) and the grid size (iiixjjj, the
number of grid points). The external parameter files available for this course are listed in Sect. C.2.
You should copy the tar-files with the source code to your home directory ($HOME) and, if necessary,
the data to your work ($WORK) or scratch ($SCRATCH) directory. You can also access the data (which
are not in tar files) directly from the directory given above. The data you create with your tests
should also be written to your work ($WORK) or scratch ($SCRATCH) directory.
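
For example, to install one of the source tar-files you could proceed as follows (illustrative only;
the sub-directory name corresponds to the Directory column of the table above):

# copy the COSMO-Model source to your home directory and unpack it there
cp /e/uhome/fegast3/TRAINING_2016/source/cosmo_151202_5.03.tar.bz2 $HOME
cd $HOME
tar -xjf cosmo_151202_5.03.tar.bz2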

C.2 External Parameters

For this Training we provide special data sets for Europe, Africa, Asia and South America. Only
for Europe do we provide several resolutions; for Africa, Asia and South America we provide data
sets with 7 km resolution only.
Europe
The following table lists the external parameter files available for Europe. These parameters are
created for a rotated grid with pollat=40.0 and pollon=-170.0.

Data Set                                  Grid Size     Degrees   Meters

cosmo_d0_02800_1605x1605.g1_2013111400    1605 × 1605   0.025       2800
cosmo_d5_07000_965x773.g1_2013111400       965 × 773    0.0625      7000
cosmo_d5_14000_483x387.g1_2013111400       483 × 387    0.125      14000

Figure C.1 shows the areas for these domains. Note that the domain for d0 is smaller than the one
for d5.

Figure C.1: Areas for the domains d0 and d5


Africa

For Africa we provide the file cosmo_africa_0.0625_2256x1617.g1. It has been created for 7 km
resolution and has a grid size of 2256 × 1617 grid points. There is no grid rotation and the rotated
pole therefore must be specified with pollat=90.0 and pollon=-180.0.

The ICON data provided are for a smaller domain, which is illustrated with a red rectangle. COSMO
domains must be specified within this red rectangle.

Figure C.2: External parameters for Africa


Asia

For Asia we provide the file cosmo_asia_0.0625_1921x1601.g1. It has been created for 7 km
resolution and has a grid size of 1921 × 1601 grid points. There is no grid rotation and the rotated
pole therefore must be specified with pollat=90.0 and pollon=-180.0.

Figure C.3: External parameters for Asia


South America

For South America we provide the file cosmo_southamerica_0.0625_1233x1233.g1. It has been
created for 7 km resolution and has a grid size of 1233 × 1233 grid points. There is no grid rotation
and the rotated pole therefore must be specified with pollat=90.0 and pollon=-180.0.

The ICON data provided are for a smaller domain, which is illustrated with a red rectangle. COSMO
domains must be specified within this red rectangle.

Figure C.4: External parameters for South America


External parameter data sets can be generated for any domain on the earth. With the raw data
sets available up to now, the highest resolution possible is about 2 km (0.02 degrees). At DWD
there is a tool EXTPAR for creating the external parameters. The CLM-Community has a slightly
different tool, PEP (Preparation of External Parameters), which can use additional raw data sets.
More information on this issue can be found in the COSMO-Model documentation.

C.3 Driving Data

3-hourly data are provided for the four regions Europe, Africa, Asia and South America. More
information on these data will be given during the Training.


Appendix D

Necessary Files and Data for the Climate Mode

For the climate simulation, all the necessary code and input files to start working with the COSMO-
CLM are included in the ’starter package’ which is available on the CLM-Community web page.
Copy this file to your work directory (/work/bb0721/<userid>/ on mistral). After unpacking the
archive (tar -xzvf cclm-sp.tgz) you find 6 directories:

src Source code for INT2LM, COSMO-CLM, the GRIB-Library and the auxil-
iary program package CFU

data Data fields for initial, boundary and external parameters; Initial and bound-
ary data are provided for two months (January and February 2000) from
NCEP/NCAR reanalysis data.

step_by_step separate run scripts for a sample INT2LM and COSMO-CLM simulation,
with two versions provided: one (subdirectory gcm_to_cclm) for an inter-
polation from coarse model input to COSMO-CLM and one (subdirectory
cclm_to_cclm) for a nesting from COSMO-CLM to COSMO-CLM

chain run scripts for the same sample INT2LM and COSMO-CLM simulations but
with the complete chain script environment for long-term simulations.

other_systems adaptations of the starter package for other computer systems. If you want to
use these, please replace the files of the IBM Power6 version accordingly.

docs documentation of the starter package.

The starter package is written for the mistral at DKRZ. Additional changes need to be applied
on other machines; this is noted in the text for those cases we already know of. The examples are
built in such a way that everything will be stored under the directory tree starting with SPDIR (create
this helper environment variable with export SPDIR=$PWD/cclm-sp in the directory where you have
unpacked the starter package). Additionally, you will find a script init.sh in the main directory
which initializes the package by replacing the current SPDIR in the scripts (it has to be called once
after unpacking the starter package). Running all examples and keeping all files will need about 50
GB of disk space.
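
Putting these steps together, the initial setup on mistral could look as follows (assuming the archive
has been copied to your work directory and init.sh is executed from the package's main directory):

cd /work/bb0721/<userid>/
tar -xzvf cclm-sp.tgz          # unpack the starter package
export SPDIR=$PWD/cclm-sp      # helper variable pointing to the package
cd cclm-sp
./init.sh                      # replaces the current SPDIR in the scripts (call once)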


D.1 The Source Codes

For this training course we use the new model code which has recently been unified from the latest
NWP and RCM model developments:

int2lm_2.0_clm1
cosmo_5.0

D.2 External Parameters

For the 0.44° grid spacing we are using in this course, we provide the external parameter file (in the
subdirectory data/ext/). The grid size of this file is rlon = 105 and rlat = 115 with a rotated
pole at pollat = 39.25 and pollon = -162.0.
Figure D.1 shows the area of this domain.

Figure D.1: Area for the domain of the provided external parameters (colours show the surface
height in m).

External parameter data sets for any other domain and resolution can be generated by a tool called
PEP (Preparation of External Parameters). This can be found on the CLM-Community home page.
In general, the external parameter data sets which are provided by DWD for NWP can also be
used (see Appendix C).

D.3 Initial and Boundary Data

For this course we will use the NCEP/NCAR reanalysis data as global model input. These data can
be downloaded from the NCEP/NCAR homepage and are freely available for scientific applications.


On the mistral computer in Hamburg, a preprocessed version (ready to use for INT2LM) of the
NCEP data is stored in the directory /work/coast/prj1/reanalysis/NCEP1/.

The data are available at 6-hourly intervals for the years 1948 – 2013. For this training course, we have
selected two months of data: January and February 2000. These can be found in the subdirectories
data/gcm/ncep/2000_mm with mm = 01 and 02.

D.4 Preprocessor and model system overview

The type of global model input which can be used for COSMO-CLM, the structure of the total
workflow, as well as the naming convention of the intermediate files are shown in Figure D.2.

Figure D.2: Model chain structure of the COSMO-CLM.

GOOD LUCK

and

HAVE FUN
