
Geosci. Model Dev., 11, 77–81, 2018
https://doi.org/10.5194/gmd-11-77-2018
© Author(s) 2018. This work is distributed under
the Creative Commons Attribution 4.0 License.

The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2
Dustin J. Swales1,2 , Robert Pincus1,2 , and Alejandro Bodas-Salcedo3
1 Cooperative Institute for Research in Environmental Sciences, University of Colorado Boulder, Boulder, Colorado, USA
2 NOAA/Earth System Research Laboratory, Boulder, Colorado, USA
3 Met Office Hadley Centre, Exeter, UK

Correspondence: Dustin J. Swales ([email protected])

Received: 22 June 2017 – Discussion started: 8 August 2017
Revised: 6 November 2017 – Accepted: 23 November 2017 – Published: 9 January 2018

Abstract. The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP) gathers together a collection of observation proxies or "satellite simulators" that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.

1 A common language for clouds

The most recent revision to the protocols for the Coupled Model Intercomparison Project (Eyring et al., 2016) includes a set of four experiments for the Diagnosis, Evaluation, and Characterization of Klima (Climate). As the name implies, one intent of these experiments is to evaluate model fields against observations, especially in simulations in which sea-surface temperatures are prescribed to follow historical observations. Such an evaluation is particularly important for clouds since these are a primary control on the Earth's radiation budget.

But such a comparison is not straightforward. The most comprehensive views of clouds are provided by satellite remote sensing observations. Comparisons to these observations are hampered by the large discrepancy between the model representation, as profiles of bulk macro- and microphysical cloud properties, and the information available in the observations, which may, for example, be sensitive only to column-integrated properties or be subject to sampling issues caused by limited measurement sensitivity or signal attenuation. To make comparisons more robust, the Cloud Feedback Model Intercomparison Project (CFMIP, https://www.earthsystemcog.org/projects/cfmip/) has led efforts to apply observation proxies or "instrument simulators" to climate model simulations made in support of the Climate Model Intercomparison Project (CMIP) and CFMIP.

Instrument simulators are diagnostic tools that map the model state into synthetic observations. The ISCCP (International Satellite Cloud Climatology Project) simulator (Klein and Jakob, 1999; Webb et al., 2001), for example, maps a specific representation of cloudiness to aggregated estimates of cloud-top pressure and optical thickness as would be provided by a particular satellite observing program, accounting for sampling artifacts such as the masking of high clouds by low clouds and providing statistical summaries computed in precise analogy to the observational datasets. Subsequent efforts have produced simulators for other passive instruments that include the Multi-angle Imaging SpectroRadiometer (MISR; Marchand and Ackerman, 2010) and the Moderate Resolution Imaging Spectroradiometer (MODIS; Pincus et al., 2012) and for the active platforms Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation

Published by Copernicus Publications on behalf of the European Geosciences Union.


78 D. J. Swales et al.: The Cloud Feedback Model Intercomparison Project

(CALIPSO; Chepfer et al., 2008) and CloudSat (Haynes et al., 2007).

Some climate models participating in the initial phase of CFMIP provided results from the ISCCP simulator. To ease the way for adoption of multiple simulators, CFMIP organized the development of the Observation Simulator Package (COSP; Bodas-Salcedo et al., 2011). A complete list of the instrument simulator diagnostics available in COSP1, and also in COSP2, can be found in Bodas-Salcedo et al. (2011). The initial implementation, hereafter COSP1, supported more widespread and thorough diagnostic output requested as part of the second phase of CFMIP associated with CMIP5 (Taylor et al., 2012). Similar but somewhat broader requests are made as part of CFMIP3 (Webb et al., 2017) and CMIP6 (Eyring et al., 2016).

The view of model clouds provided by COSP has enabled important advances. Results from COSP have been useful in identifying biases in the distribution of model-simulated clouds within individual models (Kay et al., 2012; Nam and Quaas, 2012), across the collection of models participating in coordinated experiments (Nam et al., 2012), and across model generations (Klein et al., 2013). Combined results from active and passive sensors have highlighted tensions between process fidelity and the ability of models to reproduce historical warming (Suzuki et al., 2013), while synthetic observations from the CALIPSO simulator have demonstrated how changes in vertical structure may provide the most robust measure of the effect of climate change on clouds (Chepfer et al., 2014). Results from the ISCCP simulator have been used to estimate cloud feedbacks and adjustments (Zelinka et al., 2013) through the use of radiative kernels (Zelinka et al., 2012).

COSP1 simplified the implementation of multiple simulators within climate models but treated many components, especially the underlying simulators contributed by a range of collaborators, as inviolate. After most of a decade this approach was showing its age, as we detail in the next section. Section 3 describes the conceptual model underlying a new implementation of COSP and a design that addresses these issues. Section 4 provides some details regarding implementation. Section 5 contains a summary of COSP2 and provides information about obtaining and building the software.

2 Barriers to consistency, efficiency, and extensibility

Especially in the context of cloud feedbacks, diagnostic information about clouds is most helpful when it is consistent with the radiative fluxes to which the model is subject. COSP2 primarily seeks to address a range of difficulties that arose in maintaining this consistency in COSP1 as the package became used in an increasingly wide range of models. For example, as COSP1 was implemented in a handful of models, it became clear that differing cloud microphysics across models would often require quite substantial code changes to maintain consistency between COSP1 and the host model.

The satellite observations COSP emulates are derived from individual observations made on spatial scales of order kilometers (for active sensors, tens of meters) and statistically summarized at ∼ 100 km scales commensurate with model predictions. To represent this scale bridging, the ISCCP simulator introduced the idea of subcolumns – discrete, homogeneous samples constructed so that a large ensemble reproduces the profile of bulk cloud properties within a model grid column and any overlap assumptions made about vertical structure. COSP1 inherited the specific methods for generating subcolumns from the ISCCP simulator, including a fixed set of inputs (convective and stratiform cloud fractions, visible-wavelength optical thickness for ice and liquid, mid-infrared emissivity) describing the distribution of cloudiness. Models for which this description was not appropriate, for example a model in which more than one category of ice was considered in the radiation calculation (Kay et al., 2012), had to make extensive changes to COSP if the diagnostics were to be informative.

The fixed set of inputs limited models' ability to remain consistent with the radiation calculations. Many global models now use the Monte Carlo independent column approximation (McICA; Pincus et al., 2003) to represent subgrid-scale cloud variability in radiation calculations. Inspired by the ISCCP simulator, McICA randomly assigns subcolumns to spectral intervals, replacing a two-dimensional integral over cloud state and wavelength with a Monte Carlo sample. Models using McICA for radiation calculations must implement methods for generating subcolumns, and the inability to share these calculations between radiation and diagnostic calculations was neither efficient nor self-consistent.

COSP1 was effective in packaging together a set of simulators developed independently and without coordination, but this had its costs. COSP1 contains three independent routines for computing joint histograms, for example. Simulators required overlapping inputs, some closely related (relative and specific humidity, for example), and produced arbitrary mixes of outputs at the column and subcolumn scale, making multi-sensor analyses difficult.

3 A conceptual model and the resulting design

Though the division was not always apparent in COSP1, all satellite simulators perform four discrete tasks within each column:

1. sampling of cloud properties to create homogeneous subcolumns;

2. mapping of cloud physical properties (e.g., condensate concentrations and particle sizes) to relevant optical

Geosci. Model Dev., 11, 77–81, 2018 www.geosci-model-dev.net/11/77/2018/



Figure 1. Organizational view of COSP2. Within each grid cell host models provide a range of physical inputs at the grid scale (grey
ovals, one profile per variable) and optical properties at the cloud scale (green circles, Nsubcol profiles per variable). Individual subcolumn
simulators (lens shapes, colored to indicate simulator types) produce Nsubcol synthetic retrievals (squares) which are then summarized by
aggregation routines (funnel shapes) taking input from one or more subcolumn simulators.

properties (optical depth, single scattering albedo, radar reflectivity, etc.);

3. synthetic retrievals of individual observations (e.g., profiles of attenuated lidar backscatter or cloud-top pressure/column optical thickness pairs); and

4. statistical summarization (e.g., appropriate averaging or computation of histograms).

The first two steps require detailed knowledge as to how a host model represents cloud physical properties; the last two steps mimic the observational process.

The design of COSP2 reflects this conceptual model. The primary inputs to COSP2 are subcolumns of optical properties (i.e., the result of step 2 above), and it is the host model's responsibility to generate subcolumns and map physical to optical properties consistent with model formulation. This choice allows models to leverage infrastructure for radiation codes using McICA, making radiation and diagnostic calculations consistent with one another. Just as with previous versions of COSP, using subcolumns is only necessary for models with coarser resolutions (e.g., GCMs); for high-resolution models (e.g., cloud-resolving models), model columns can be provided directly to COSP2. The instrument simulator components were reorganized to eliminate any internal dependencies on the host model, and subsequently on a model scale. COSP2 also requires as input a small set of column-scale quantities including surface properties and thermodynamic profiles. These are used, for example, by the ISCCP simulator to mimic the retrieval of cloud-top pressure from infrared brightness temperature.

Simulators within COSP2 are explicitly divided into two components (Fig. 1). The subcolumn simulators, shown as lenses with colors representing the sensor being mimicked, take a range of column inputs (ovals) and subcolumn inputs (circles, with stacks representing multiple samples) and produce synthetic retrievals on the subcolumn scale, shown as stacks of squares. Column simulators, drawn as funnels, reduce these subcolumn synthetic retrievals to statistical summaries (hexagons). Column simulators may summarize information from a single observing system, as indicated by shared colors. Other column simulators may synthesize subcolumn retrievals from multiple sources, as suggested by the black funnel.

This division mirrors the processing of satellite observations by space agencies. At NASA, for example, these processing steps correspond to the production of Level 2 and Level 3 data, respectively. Implementation required the restructuring of many of the component simulators from COSP1. This allowed for modest code simplification by using common routines to make statistical calculations.

Separating the computation of optical properties from the description of individual simulators allows for modestly increased efficiency because inputs shared across simulators, for example the 0.67 µm optical depth required by the ISCCP, MODIS, and MISR simulators, do not need to be recomputed or copied. The division also allowed us to make some simulators more generic. In particular, the CloudSat simulator used by COSP is based on the Quickbeam package (Haynes et al., 2007). Quickbeam is quite generic with respect to radar frequency and the location of a sensor, but this flexibility was lost in COSP1. COSP2 exposes the generic nature of the underlying subcolumn lidar and radar simulators and introduces configuration variables that provide instrument-specific information to the subcolumn calculation.

4 Implementation

4.1 Interface and control flow

The simplest call to COSP now makes use of three Fortran-derived types representing the column and subcolumn inputs and the desired outputs. The components of these types are PUBLIC (that is, accessible by user code) and are, with few exceptions, pointers to appropriately dimensioned arrays. COSP determines which subcolumn and column simulators are to be run based on the allocation status of these arrays, as described below. All required subcolumn simulators are invoked, followed by all column simulators. Optional arguments can be provided to restrict work to a subset of the provided domain (set of columns) to limit memory use.

COSP2 has no explicit way of controlling which simulators are to be invoked. Instead, column simulators are invoked if space for one or more outputs is allocated – that is, if one or more of the output variables (themselves components of the output-derived type) are associated with array memory of the correct shape. The set of column simulators determines which subcolumn simulators are to be run. Not providing the inputs to these subcolumn simulators is an error.

The use of derived types allows COSP's capabilities to be expanded incrementally. Adding a new simulator, for example, requires adding new components to the derived types representing inputs and outputs, but codes referring to existing components of those types need not be changed. This functionality is already in use – the output fields available in COSP2 extend COSP1's capabilities to include the joint histograms of optical thickness and effective radius requested as part of CFMIP3.

4.2 Enhancing portability

COSP2 also includes a range of changes aimed at providing more robust, portable, and/or flexible code, many of which were suggested by one or more modeling centers using COSP. These include the following.

1. Robust error checking, implemented as a single routine which validates array shapes and physical bounds on values.

2. Error reporting standardized to return strings, where non-null values indicate failure.

3. Parameterized precision for all REAL variables (KIND = wp), where the value of wp can be set in a single location to correspond to 32- or 64-bit real values.

4. Explicit INTENT for all subroutine arguments.

5. Standardization of vertical ordering for arrays, in which the top of the domain is index 1.

6. Conformity to Fortran 2003 standards.

COSP2 must also be explicitly initialized before use. The initialization routine calls routines for each simulator in turn. This allows for more flexible updating of ancillary data such as lookup tables.

5 Summary

Version 2 of the CFMIP Observational Simulator Package, COSP2, represents a substantial revision of the COSP platform. The primary goal was to allow a more flexible representation of clouds, so that the diagnostics produced by COSP can be fully consistent with radiation calculations made by the host model, even in the face of increasingly complex descriptions of cloud macro- and microphysical properties. Consistency requires that host models generate subcolumns and compute optical properties, so the interface to the host model is entirely revised relative to COSP1. As an example and a bridge to past efforts, COSP2 includes an optional layer that provides compatibility with COSP 1.4.1 (the version to be used for CFMIP3), accepting the same inputs and implementing sampling and optical property calculations in the same way.

Simulators in COSP2 are divided into those that compute subcolumn (pixel) scale synthetic retrievals and those that compute column (grid) scale statistical summaries. This distinction, and the use of extensible derived types in the interface to the host model, are designed to make it easier to extend COSP's capabilities by adding new simulators at either scale, including analyses making use of observations from multiple sources.

Code availability. The source code for COSP2, along with downloading and installation instructions, is available in a GitHub repository (https://github.com/CFMIP/COSPv2.0). Previous versions of COSP (e.g., v1.3.1, v1.3.2, v1.4.0 and v1.4.1) are available in a parallel repository (https://github.com/CFMIP/COSPv1), but these versions have reached the end of their life, and COSP2 provides the basis for future development. Models updating or implementing COSP, or developers wishing to add new capabilities, are best served by starting with COSP2.

Competing interests. The authors declare that they have no conflict of interest.
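COSP2 itself is written in Fortran, where the dispatch described in Sect. 4.1 rests on the association status of pointer components of the output-derived type. As a language-neutral illustration of that allocation-driven control flow, the short Python sketch below treats unset fields as "not allocated"; the names (`CospOutputs`, `run_cosp`, and the diagnostic fields) are hypothetical stand-ins, not part of the COSP2 API.

```python
class CospOutputs:
    """Schematic analogue of COSP2's output-derived type: each component is
    either None (diagnostic not requested) or a pre-allocated array."""

    def __init__(self, isccp_histogram=None, calipso_profiles=None):
        self.isccp_histogram = isccp_histogram    # hypothetical column diagnostic
        self.calipso_profiles = calipso_profiles  # hypothetical column diagnostic


def run_cosp(outputs):
    """Mimic the control flow of Sect. 4.1: a column simulator runs only if
    space for one of its outputs is allocated, and the set of column
    simulators implies which subcolumn simulators must run first."""
    invoked = []
    if outputs.isccp_histogram is not None:
        invoked += ["isccp_subcolumn", "isccp_column"]
    if outputs.calipso_profiles is not None:
        invoked += ["lidar_subcolumn", "calipso_column"]
    return invoked


# Requesting only the ISCCP joint histogram triggers only the ISCCP chain.
out = CospOutputs(isccp_histogram=[[0.0] * 7 for _ in range(7)])
print(run_cosp(out))  # ['isccp_subcolumn', 'isccp_column']
```

In this analogy, leaving a field as `None` plays the role of an unassociated pointer: no output memory, so neither the column simulator nor its implied subcolumn simulators are invoked.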




Acknowledgements. The authors thank the COSP Project Management Committee for guidance and Tomoo Ogura for testing the implementation of COSP2 in the MIROC climate model. Dustin Swales and Robert Pincus were financially supported by NASA under award NNX14AF17G. Alejandro Bodas-Salcedo received funding from the IS-ENES2 project, European FP7-INFRASTRUCTURES-2012-1 call (grant agreement 312979).

Edited by: Klaus Gierens
Reviewed by: Bastian Kern and one anonymous referee

References

Bodas-Salcedo, A., Webb, M. J., Bony, S., Chepfer, H., Dufresne, J. L., Klein, S. A., Zhang, Y., Marchand, R., Haynes, J. M., Pincus, R., and John, V.: COSP: satellite simulation software for model assessment, B. Am. Meteorol. Soc., 92, 1023–1043, https://doi.org/10.1175/2011BAMS2856.1, 2011.

Chepfer, H., Bony, S., Winker, D., Chiriaco, M., Dufresne, J.-L., and Seze, G.: Use of CALIPSO lidar observations to evaluate the cloudiness simulated by a climate model, Geophys. Res. Lett., 35, L15704, https://doi.org/10.1029/2008GL034207, 2008.

Chepfer, H., Noel, V., Winker, D., and Chiriaco, M.: Where and when will we observe cloud changes due to climate warming?, Geophys. Res. Lett., 41, 8387–8395, https://doi.org/10.1002/2014GL061792, 2014.

Eyring, V., Bony, S., Meehl, G. A., Senior, C. A., Stevens, B., Stouffer, R. J., and Taylor, K. E.: Overview of the Coupled Model Intercomparison Project Phase 6 (CMIP6) experimental design and organization, Geosci. Model Dev., 9, 1937–1958, https://doi.org/10.5194/gmd-9-1937-2016, 2016.

Haynes, J. M., Marchand, R., Luo, Z., Bodas-Salcedo, A., and Stephens, G. L.: A multipurpose radar simulation package: QuickBeam, B. Am. Meteorol. Soc., 88, 1723–1727, https://doi.org/10.1175/BAMS-88-11-1723, 2007.

Kay, J. E., Hillman, B. R., Klein, S. A., Zhang, Y., Medeiros, B. P., Pincus, R., Gettelman, A., Eaton, B., Boyle, J., Marchand, R., and Ackerman, T. P.: Exposing global cloud biases in the Community Atmosphere Model (CAM) using satellite observations and their corresponding instrument simulators, J. Climate, 25, 5190–5207, https://doi.org/10.1175/JCLI-D-11-00469.1, 2012.

Klein, S. A. and Jakob, C.: Validation and sensitivities of frontal clouds simulated by the ECMWF model, Mon. Weather Rev., 127, 2514–2531, https://doi.org/10.1175/1520-0493(1999)127<2514:VASOFC>2.0.CO;2, 1999.

Klein, S. A., Zhang, Y., Zelinka, M. D., Pincus, R., Boyle, J., and Gleckler, P. J.: Are climate model simulations of clouds improving? An evaluation using the ISCCP simulator, J. Geophys. Res., 118, 1329–1342, https://doi.org/10.1002/jgrd.50141, 2013.

Marchand, R. and Ackerman, T. P.: An analysis of cloud cover in Multiscale Modeling Framework Global Climate Model Simulations using 4 and 1 km horizontal grids, J. Geophys. Res., 115, D16207, https://doi.org/10.1029/2009JD013423, 2010.

Nam, C. C. W. and Quaas, J.: Evaluation of clouds and precipitation in the ECHAM5 general circulation model using CALIPSO and CloudSat satellite data, J. Climate, 25, 4975–4992, https://doi.org/10.1175/JCLI-D-11-00347.1, 2012.

Nam, C., Bony, S., Dufresne, J.-L., and Chepfer, H.: The "too few, too bright" tropical low-cloud problem in CMIP5 models, Geophys. Res. Lett., 39, L21801, https://doi.org/10.1029/2012GL053421, 2012.

Pincus, R., Barker, H. W., and Morcrette, J.-J.: A fast, flexible, approximate technique for computing radiative transfer in inhomogeneous cloud fields, J. Geophys. Res., 108, 4376, https://doi.org/10.1029/2002JD003322, 2003.

Pincus, R., Platnick, S., Ackerman, S. A., Hemler, R. S., and Hofmann, R. J. P.: Reconciling simulated and observed views of clouds: MODIS, ISCCP, and the limits of instrument simulators, J. Climate, 25, 4699–4720, https://doi.org/10.1175/JCLI-D-11-00267.1, 2012.

Suzuki, K., Golaz, J.-C., and Stephens, G. L.: Evaluating cloud tuning in a climate model with satellite observations, Geophys. Res. Lett., 40, 4464–4468, https://doi.org/10.1002/grl.50874, 2013.

Taylor, K. E., Stouffer, R. J., and Meehl, G. A.: An overview of CMIP5 and the experiment design, B. Am. Meteorol. Soc., 93, 485–498, https://doi.org/10.1175/BAMS-D-11-00094.1, 2012.

Webb, M. J., Andrews, T., Bodas-Salcedo, A., Bony, S., Bretherton, C. S., Chadwick, R., Chepfer, H., Douville, H., Good, P., Kay, J. E., Klein, S. A., Marchand, R., Medeiros, B., Siebesma, A. P., Skinner, C. B., Stevens, B., Tselioudis, G., Tsushima, Y., and Watanabe, M.: The Cloud Feedback Model Intercomparison Project (CFMIP) contribution to CMIP6, Geosci. Model Dev., 10, 359–384, https://doi.org/10.5194/gmd-10-359-2017, 2017.

Webb, M. J., Senior, C., Bony, S., and Morcrette, J.-J.: Combining ERBE and ISCCP data to assess clouds in the Hadley Centre, ECMWF and LMD atmospheric climate models, Clim. Dynam., 17, 905–922, https://doi.org/10.1007/s003820100157, 2001.

Zelinka, M. D., Klein, S. A., and Hartmann, D. L.: Computing and partitioning cloud feedbacks using cloud property histograms. Part I: Cloud radiative kernels, J. Climate, 25, 3715–3735, https://doi.org/10.1175/JCLI-D-11-00248.1, 2012.

Zelinka, M. D., Klein, S. A., Taylor, K. E., Andrews, T., Webb, M. J., Gregory, J. M., and Forster, P. M.: Contributions of different cloud types to feedbacks and rapid adjustments in CMIP5, J. Climate, 26, 5007–5027, https://doi.org/10.1175/JCLI-D-12-00555.1, 2013.
