XMM ABC Guide
AN INTRODUCTION TO
XMM-NEWTON DATA ANALYSIS
Version 2.01
23 July 2004
Copies of this guide are available in html, postscript and pdf formats.
Contents
1 Introduction 1
1.1 ACKNOWLEDGMENTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
3 Data 5
3.1 USEFUL DOCUMENTATION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
3.2 THE DATA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
3.3 PI Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
3.3.1 ODF Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
3.3.2 Pipeline Product Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
5.4 SOURCE DETECTION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
5.5 TIMING ANALYSIS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
5.6 ONCE MORE, THIS TIME WITH FEELING AND FTOOLS . . . . . . . . . . . . . . . . . . . 32
5.7 ODF DATA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
5.7.1 Rerunning the EPIC Chains . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
5.8 A More-or-Less Complete Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
1 List of Acronyms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . iv
Introduction
The purpose of this ABC Guide to XMM-Newton data analysis is to provide a simple walk-through of basic data
extraction and analysis tasks. Also included is a guide to references and help available to aid in the analysis
of the data. We have tried to balance providing enough information to give the user a useful introduction to
a variety of analysis tasks with not providing too much information, which would make a guide like this too
ponderous to use. As such, there is no intention to replace the SAS Handbook, which should be considered the
highest authority for the use of SAS. Therefore this document will not display the full versatility of the SAS
tasks, and of SAS itself, but it will hopefully show a path through the forest.
Chapter 2 provides lists of web-based references for the XMM-Newton project, help desks, analysis guides,
and science and calibration data. Chapter 3 provides a description of the data files provided for observation
data sets. Chapter 4 discusses the installation and use of SAS. Chapters 5, 6, and 7 discuss the analysis of
EPIC, RGS, and OM data respectively.
This document will continue to evolve. Updated versions will be made available on our web site at:
http://heasarc.gsfc.nasa.gov/docs/xmm/abc/
1.1 ACKNOWLEDGMENTS
This guide would not have been possible without the help and comments of the many people involved in the XMM-Newton project. In particular, we would like to thank Giuseppe Vacanti and Julian Osborne, whose comments made this a more complete and accurate document.
IMH wishes to thank all the OM calibration team and in particular Antonio Talavera, Matteo Guainazzi
and Bing Chen for their help in the preparation of this and other documents related to the OM.
SLS wishes to thank Dave Lumb, Richard Saxton, and Steve Sembay for their helpful insights into EPIC
data analysis.
Chapter 2
Long-Term Timeline:
http://xmm.vilspa.esa.es/external/xmm sched/advance plan.shtml
2.6 SOFTWARE
XMM-Newton Standard Analysis System (SAS):
http://xmm.vilspa.esa.es/external/xmm sw cal/sas frame.shtml
HEASARC HEASoft Package:
http://heasarc.gsfc.nasa.gov/docs/corp/software.html
CXC CIAO Package:
http://asc.harvard.edu/ciao/
Data
3.3 PI Data
Proprietary XMM-Newton data are available for download via your XSA account. The SOC at Vilspa emails instructions to the address on record with detailed directions on how to retrieve your data via the XSA.
When retrieved, the data files come in two groups, in separate subdirectories: the Observation Data Files (ODF) and the Pipeline Processing (PPS) files. The ODF data contain all of the observation-specific data necessary for reprocessing the observation. The PPS data contain, among other things, calibrated photon event files and source lists.
NOTE: For observation data sets going to US PIs, the GSFC GOF makes the data available in two
directories containing the following groups of files.
NOTE: For SAS processing, the file names should contain all upper case characters. However, at least
with early CDs, the file names used lower case characters. The GSFC XMM-Newton GOF provides a script to
rename the files.
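The rename can be sketched with a small shell function; this is a hypothetical stand-in, not the GOF-provided script, which should be preferred.

```shell
# Sketch only: upper-case all file names in a directory, assuming no
# name collisions. The GOF-provided rename script is the supported route.
upcase_names() {
    for f in "$1"/*; do
        [ -e "$f" ] || continue            # guard against an empty directory
        base=$(basename "$f")
        upper=$(printf '%s' "$base" | tr '[:lower:]' '[:upper:]')
        if [ "$base" != "$upper" ]; then
            mv -- "$f" "$1/$upper"
        fi
    done
}

# Example: upcase_names /path/to/odf_data
```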
PPiiiiiijjkkAAAAAA000_0.HTM, where
iiiiiijjkk  observation number
AAAAAA      group identifier (see Table 3.1)
Group ID    Contents
CRSCOR (1)  Contains PDF files of POSS II finding charts, HTML files of cross-correlations
            with the SIMBAD data base, and FITS tables for the detected sources
EANCIL (1)  Contains the exposure maps in a variety of energy bands and the source-detection
            sensitivity maps for the EPIC instruments. The sensitivities are in units of
            counts s^-1, corrected for vignetting and corresponding to a likelihood specified
            in the FITS header. The files are gzipped with a .FTZ extension.
EEVLIS (1)  Contains calibrated photon event files for the EPIC detectors. If the files are
            sufficiently large they may be separated into two tar files. The files are gzipped
            FITS files with a .FTZ extension.
ESKYIM (1)  Contains the event images in a variety of energy bands. The FITS files
            are gzipped with a .FTZ extension; the full images also come as PNG images.
ESRLIS (1)  Contains EPIC observation source lists. There is an HTML page of the merged
            source list and gzipped FITS tables of source lists from the different instruments
            and source-detection tasks.
PPSDAT      Contains the Calibration Index File (CIF) used in the pipeline processing (*CALIND*),
            PPS information, and the attitude history time series (*ATTTSR*) in gzipped FITS
            or ASCII format.
PPSGRA      Contains the OM tracking history plots and the PPS, EPIC, OM, and RGS observation
            and PPS-run summaries.
PPSMSG      ASCII file containing the pipeline processing report
REVLIS (3)  Contains the RGS source and event lists in gzipped FITS format
REXPIM (3)  Contains the RGS exposure maps in gzipped FITS format
RIMAGE (3)  Contains the RGS images (both energy dispersion and cross dispersion) in gzipped
            FITS and PNG formats
RSPECT (3)  Contains the RGS source and background spectra in gzipped FITS and PDF formats
(1) Further information on the files can be found in Table 5.1.
(2) Further information on the files can be found in Table 7.1.
(3) Further information on the files can be found in Table 6.1.
Chapter 4
4.1 INSTALLATION
The primary guide for the installation of SAS can be found through the SOC at
http://xmm.vilspa.esa.es/external/xmm sw cal/sas frame.shtml (note that the final / is often required
for SOC pages). Because of the complexity of the SAS installation, it is strongly recommended that users
download and install the binary executables rather than compiling SAS from source code (which also necessitates
the purchase of commercial software). It should also be noted that optional components, while not needed for
running SAS tasks from the command-line, are critical to running SAS from the GUI. These optional components
are listed at the SOC page http://xmm.vilspa.esa.es/sas/installation/requirements.shtml.
setenv SAS_DIR /path/to/sas          Sets the SAS directory path
source $SAS_DIR/sas-setup.csh        Initializes SAS
$SAS_DIR/sas-setup.sh                Alternate SAS initialization
setenv SAS_ODF /path/to/odf_data     Sets the directory path to the ODF data; it is
                                     probably a good idea to have this be the
                                     full path
setenv SAS_CCFPATH /path/to/CCF      Sets the directory path to the CCF data
setenv SAS_CCF $SAS_ODF/ccf.cif      Sets the Calibration Index File (CIF) path and
                                     file name (note that the CIF is normally
                                     part of an event list, so SAS_CCF can also
                                     be pointed at the list; this should probably
                                     be the full path as well)
setenv SAS_VERBOSITY 3               Sets the verbosity, 1 => little, 10 => lot
setenv SAS_SUPPRESS_WARNING 3        Sets the warning level, 1 => little, 10 => lot
sas &                                Invokes the SAS GUI; SAS tasks can also be run
                                     on the command line
NOTE: To verify the SAS-specific settings, use the task sasversion (alternatively, the command env | grep
SAS can be used).
SAS need not be run in the directory where the data are stored (for example, it will be possible to run off of the data CDs once the file names are changed to be upper case). To do so only requires that setenv SAS_CCF $SAS_ODF/ccf.cif be reset to the directory string for the working directory (see § 4.5.1). This can also be done using the SAS Preferences GUI (found under the File menu). For command-line invocation of tasks, the input and output directories, when relevant, can be set as parameters (e.g., see the command-line input for odfingest, § 4.5.2).
SAS tasks can be run equally well from the command line and from the SAS GUI. In this document we
will demonstrate the use of most tasks from the command line. In many cases parameters where the default
values are acceptable are not included in the command list, which can be done in practice as well. If the GUI
interface is being used then simply set the parameters there.
The MPE Analysis Guide, http://wave.xray.mpe.mpg.de/xmm/data_analysis, demonstrates many of the common tasks using GUIs.
4.5.1 cifbuild
Many SAS tasks require calibration information from the Calibration Access Layer (CAL). Relevant files
are accessed from the set of Current Calibration File (CCF) data using a CCF Index File (CIF). A CIF
is included in the pipeline products but if the CCF has been updated it can be recreated by the user. In
practice, it is perhaps easiest to determine whether the CCF has been updated by recreating the CIF us-
ing the SAS task cifbuild (default name ccf.cif) and then using the SAS task cifdiff to compare the new
CIF with the old. If the CAL has changed the user may want to reprocess the data using the new CIF
(e.g., see § 5.7.1). To help determine whether it is reasonable to reprocess the data, the CCF release notes
(http://xmm.vilspa.esa.es/user/calib_top.html) should be examined.
CCF files can be downloaded directly from the SOC web site (see § 4.2).
WARNING: The CIF file contains a list of files to be used in the calibration/processing of your data. The
task cifbuild looks at the CCF directory and builds the CIF file accordingly. If the data are processed with
two different CIF files (e.g., because they were generated at different times, with different files under the CCF
directory) you can end up with different results (although most often not significantly different). Note that the
pipeline product *CALIND* is the CIF file used for the pipeline processing.
To run cifbuild and cifdiff on the command line use:
4.5.2 odfingest
The task odfingest extends the Observation Data File (ODF) summary file with data extracted from the instru-
ment housekeeping data files and the calibration database. It is required for reprocessing the ODF data with
the pipeline tasks as well as for many other tasks.
To run odfingest on the command line use:
So, you've received an XMM-Newton EPIC data set. What are you going to do with it? After checking what the observation consists of (see § 3.2), you can start with the Pipeline Processed data. As noted in Chapter 4, a variety of analysis packages can be used for the following steps. However, as SAS was designed for the basic reduction and analysis of XMM-Newton data (extraction of spatial, spectral, and temporal data), it will be used here for demonstration purposes (although see § 5.6 for a short tutorial on the use of Xselect for data extraction). SAS will be required in any case for the production of detector response files (RMFs and ARFs) and other observatory-specific requirements. (Although for the simple case of on-axis point sources the canned response files provided by the SOC can be used.)
NOTE: For PN observations with very bright sources, out-of-time events can seriously contaminate the image. Out-of-time events occur because the read-out period for the CCDs can be up to 6.3% of the frame time. Since events that occur during the read-out period can't be distinguished from other events, they are included in the event files but have invalid locations. For observations with bright sources, this can cause bright stripes in the image along the CCD read-out direction. For a more detailed description of this issue, check: http://wave.xray.mpe.mpg.de/xmm/cookbook/EPIC_PN/ootevents.html
PPiiiiiijjkkAAX000SUMMAR0000.HTM, where
iiiiii  proposal number
jj      target ID (target number in proposal)
kk      exposure ID (exposure number for target)
NOTE: The ten-digit combination iiiiiijjkk is the observation number and is used repeatedly throughout the file nomenclature.
AA      summary type (EP = EPIC summary, OM = Optical Monitor summary, RG = RGS summary, OB = observation summary; for OB, SUMMAR is replaced by PPSSUM for the Pipeline Processing Summary, and for CA, SUMMAR is replaced by XCORRE for the Source Correlation Summary)
For each grouping of the pipeline products (Tables 3.1 and 5.1) there is an HTML file (.HTM extension) which lists the associated files and gives a few-word description of each. It is useful to set up your web browser to automatically display a number of file types, e.g., PDF files. The HTML file names are of the following format:
jj target ID - target number in proposal
kk exposure ID - exposure number for target
AAAAAA Group ID (Table 5.1)
The data file names are of the form (see Table 41 in the XMM Data Files Handbook,
ftp://xmm.vilspa.esa.es/pub/odf/data/sv/docs/datafileshb_2_0.pdf.gz, or *.ps.gz):
PiiiiiijjkkaablllCCCCCCnmmm.zzz, where
iiiiiijjkk  observation number
aa          detector (M1 = MOS1, M2 = MOS2, PN = PN; CA for files from the CRSCOR group)
b           S for a scheduled observation, U for unscheduled, X for files from the CRSCOR group
            (and any product that is not due to a single exposure)
lll         exposure number
CCCCCC      file identification (Table 5.1)
n           exposure map band number, unimportant otherwise for EPIC data
mmm         source number in hexadecimal
zzz         file type (e.g., PDF, PNG, FTZ, HTM)
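As a sketch of the layout above, a small shell helper (illustrative only, not part of SAS; field positions follow the pattern just described) can split a product file name into its fields:

```shell
# Sketch: split a PPS product file name PiiiiiijjkkaablllCCCCCCnmmm.zzz
# into its fields by character position (this helper is hypothetical).
parse_pps_name() {
    n=$1
    obs=$(printf '%s' "$n" | cut -c2-11)     # iiiiiijjkk  observation number
    det=$(printf '%s' "$n" | cut -c12-13)    # aa          detector
    sched=$(printf '%s' "$n" | cut -c14)     # b           scheduled/unscheduled flag
    expo=$(printf '%s' "$n" | cut -c15-17)   # lll         exposure number
    fid=$(printf '%s' "$n" | cut -c18-23)    # CCCCCC      file identification
    band=$(printf '%s' "$n" | cut -c24)      # n           band number
    src=$(printf '%s' "$n" | cut -c25-27)    # mmm         source number (hex)
    ext=${n##*.}                             # zzz         file type
    echo "obs=$obs det=$det sched=$sched expo=$expo id=$fid band=$band src=$src type=$ext"
}

# Example:
# parse_pps_name P0123700101M1S001MIEVLI0000.FTZ
```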
ASC  ASCII file; use Netscape, another web browser, or the more command
ASZ  gzipped ASCII file
FTZ  gzipped FITS format; use ds9, Ximage, Xselect, fv
HTM  HTML file; use Netscape or another web browser
PDF  Portable Document Format; use Acrobat Reader
PNG  Portable Network Graphics file; use Netscape or another web browser
TAR  TAR file
ESKYIM IMAGE 8 Sky image 0.2 - 12.0 keV Zipped FITS ds9, Ximage, fv
IMAGE 1 Sky image 0.2 - 0.5 keV Zipped FITS ds9, Ximage, fv
IMAGE 2 Sky image 0.5 - 2.0 keV Zipped FITS ds9, Ximage, fv
IMAGE 3 Sky image 2.0 - 4.5 keV Zipped FITS ds9, Ximage, fv
IMAGE 4 Sky image 4.5 - 7.5 keV Zipped FITS ds9, Ximage, fv
IMAGE 5 Sky image 7.5 - 12.0 keV Zipped FITS ds9, Ximage, fv
EANCIL EXPMAP 8 Exposure map 0.2 - 12.0 keV Zipped FITS, PNG ds9, Ximage, fv, Netscape
EXPMAP 1 Exposure map 0.2 - 0.5 keV Zipped FITS ds9, Ximage, fv
EXPMAP 2 Exposure map 0.5 - 2.0 keV Zipped FITS ds9, Ximage, fv
EXPMAP 3 Exposure map 2.0 - 4.5 keV Zipped FITS ds9, Ximage, fv
EXPMAP 4 Exposure map 4.5 - 7.5 keV Zipped FITS ds9, Ximage, fv
EXPMAP 5 Exposure map 7.5 - 12.0 keV Zipped FITS ds9, Ximage, fv
EXSNMP Exposure sensitivity map Zipped FITS ds9, Ximage, fv
EEVLIS (2) MIEVLI MOS imaging mode event list Zipped FITS xmmselect, fv, Xselect
PIEVLI PN imaging mode event list Zipped FITS xmmselect, fv, Xselect
TIEVLI PN, MOS timing mode event list Zipped FITS xmmselect, fv, Xselect
(1) NNNNN = alphanumeric ID
(2) Files for only those modes which were active will be included.
1) gunzip the PP event list to be examined (not strictly necessary) and, for practical purposes, shorten the file name as well, e.g.:
mv P0123700101M1S001MIEVLI0000.FTZ mos1.fits.gz
gunzip mos1.fits.gz
2a) In preparation set a few SAS parameters (directory pointers):
setenv SAS_ODF /ODF
setenv SAS_CCFPATH /CCF
setenv SAS_CCF /PROC/ccf.cif
To verify the SAS-specific settings, use the command env | grep SAS, and remember that for SAS_ODF and SAS_CCF it is best to use the full path.
2b) If it doesn't already exist, create a CIF file using the SAS task cifbuild (§ 4.5.1).
cifbuild fullpath=yes
2c) If it hasn't already been done (don't do it twice), prepare the ODF data by using the SAS task odfingest (necessary for many SAS tasks; see § 4.5.2).
odfingest odfdir=$SAS_ODF outdir=$SAS_ODF
3) Invoke the SAS GUI (Figure 5.1).
sas &
Figure 5.1: The SAS GUI. To locate and invoke a task one need only start typing the task name and, when it is highlighted, hit a carriage return. Otherwise, double-click on the task name.
4) Invoke the xmmselect GUI (Figure 5.2) from the SAS GUI. To invoke a task one need only start typing the task name and, when it is highlighted, hit a carriage return.
When xmmselect is invoked a dialog box will first appear requesting a file name. One can either use
the browser button or just type the file name in the entry area, mos1.fits in this case. To use the
browser, first click on the file folder icon button on the right which will bring up a second GUI for
the file selection. Double click on the desired event file in the right-hand column (you may have to
open the appropriate directory first), click on the EVENTS extension in the right-hand column
(which selects the extension), and then click Ok. The directory GUI will then disappear; click Run on the selection GUI.
When the file name has been submitted, the xmmselect GUI (Figure 5.2) will appear, along with a dialog box offering to display the selection expression. The selection expression will include the filtering done to this point on the event file, which for the pipeline processing consists mostly of CCD and GTI selections.
Figure 5.2: The xmmselect GUI. The top dialog area is for the selection expression. The central part of the
GUI provides a list of the parameters available in the table (note the scroll bar on the right hand side). Two-
dimensional data are selected using the square boxes on the left hand side (in this case X,Y, sky coordinates,
have been selected) while one-dimensional data are selected using the round boxes (Time in this example).
Figure 5.3: The evselect GUI. Additional parameters for the selected process can be accessed through the tabs
at the top of the GUI.
To create an image of the data in sky coordinates check the square boxes to the left of the X and
Y entries.
Click on the Image button near the bottom of the page. This brings up the evselect GUI (Fig-
ure 5.3).
The default settings are reasonable for a basic image so click on the Run button at the lower left
corner of the evselect GUI. Different binnings and other selections can be invoked by accessing the
Image tab at the top of the GUI.
The resultant image is written to the file image.ds, and the image is automatically displayed using
ds9, and is shown in Figure 5.4.
5b) Using the command line interface, create an image in sky coordinates by using the task evselect. The
same image produced in 5a) can be created using the following command.
evselect table=/PIPE/mos1.fits:EVENTS withimageset=yes imageset=image.fits
xcolumn=X ycolumn=Y imagebinning=imageSize ximagesize=600 yimagesize=600
> table input event table.
> withimageset make an image.
> imageset name of output image.
> xcolumn event column for X axis.
> ycolumn event column for Y axis.
> imagebinning form of binning, force entire image into a given size or bin by a specified number
of pixels.
> ximagesize output image pixels in X.
> yimagesize output image pixels in Y.
Display the output file image.fits using, e.g., ds9 image.fits &.
Figure 5.4: ds9 window showing the unfiltered image of the MOS1 data from the Lockman Hole SV1 observation,
displayed on a square root scale with an upper cut value of 40 using the SLS color look-up table.
5.2.3 Create and Display a Light Curve
6a) Create a light curve of the observation by using the xmmselect GUI (Figure 5.2).
To create a light curve check the round box to the left of the Time entry.
Click on the OGIP Rate Curve button near the bottom of the page. This brings up the evselect
GUI (Figure 5.3).
The default setting is for a one-second bin which is a bit fine, so access the Lightcurve tab and
change the timebinsize to, e.g., 100 (100 s). Click on the Run button at the lower left corner of
the evselect GUI.
The resultant light curve is written to the file rates.ds, and is displayed automatically using Grace
(Figure 5.5).
6b) Using the command line interface, create a light curve of the observation using the task evselect then
display with dsplot.
evselect table=/PIPE/mos1.fits:EVENTS withrateset=yes rateset=rate.fits
maketimecolumn=yes timecolumn=TIME timebinsize=100 makeratecolumn=yes
> table input event table.
> withrateset make a light curve.
> rateset name of output light curve file.
> maketimecolumn control to create a time column
> timecolumn time column label
> timebinsize time binning (seconds)
> makeratecolumn control to create a count rate column, otherwise a count column will be
created
dsplot table=rate.fits x=TIME y=RATE.ERROR withoffsetx=yes &
> table input event table
> x column for plotting on X axis
> y column for plotting on Y axis, the nomenclature RATE.ERROR plots the count rate column
(RATE) with the count-rate error column (ERROR) as uncertainties
> withoffsetx creates an offset to the X axis (-73194570.96472888 s in Figure 5.5)
Figure 5.5: Grace window showing the unfiltered light curve of the MOS1 data from the Lockman Hole SV1
observation. Also shown is the time selection interval.
Since MOS data are being used, in the selection expression area at the top of the xmmselect GUI
enter:
(PATTERN <= 12)&&(PI in [200:12000])&&#XMMEA_EM.
Click on the Filtered Table box at the lower left of the xmmselect GUI.
Change the evselect filteredset parameter, the output file name, to something useful, e.g.,
mos1-filt.fits. Click Run.
7b) Filter the data using evselect on the command line.
evselect table=mos1.fits:EVENTS withfilteredset=yes
expression=(PATTERN <= 12)&&(PI in [200:12000])&&#XMMEA_EM
filteredset=mos1-filt.fits filtertype=expression keepfilteroutput=yes
updateexposure=yes filterexposure=yes
> table input event table.
> filtertype method of filtering
> expression filtering expression.
> withfilteredset create a filtered set.
> filteredset output file name.
> keepfilteroutput save the filtered output
> updateexposure for use with temporal filtering
> filterexposure for use with temporal filtering
8) If necessary (and for the Lockman Hole SV1 data it most definitely is), add a temporal filtering clause to the evselect selection expression. This is most often required because of soft proton flaring, which can be painfully obvious at count rates of 50-100 counts per second or more. Note that how much flaring needs to be excluded depends on the science goals of the analysis; a whopping bright point source will clearly be less affected than a faint extended object. A temporal filter can be easily created from the Grace light curve plot window.
Create a light-curve plot through the xmmselect GUI
In the Grace window, pull down the Edit menu, select Regions, and select Define
For this case, select Left of Line for the Region type
Click the Define button and then click at two points to create a vertical line at the upper end of the
desired range on the Grace plot. (It is possible to define up to five regions at one time by changing
the Define region counter.)
Back on the xmmselect GUI, click on the 1D region button. This will transfer the selection criteria
to the Selection expression location.
The syntax for the time selection is (TIME <= 73227600). A more complicated expression, which would remove a small flare within an otherwise good interval (e.g., the soft proton flares observed in the light curve plot of Figure 5.5), could be: (TIME <= 73227600)&&!(TIME IN [73221920:73223800]). The syntax &&(TIME <= 73227600) includes only events with times less than or equal to 73227600. Use &&!(TIME in [73221920:73223800]) to exclude events in the time interval 73221920 to 73223800; the ! symbol stands for the logical "not". The full expression would then be:
(PATTERN <= 12)&&(PI in [200:12000])&&#XMMEA_EM&&(TIME <= 73227600)&&!(TIME in [73221920:73223800])
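As a sketch, the pieces of the full expression can be assembled in the shell and handed to evselect; the command here is echoed rather than executed, so it can be inspected first (drop the echo to run it for real).

```shell
# Sketch: assemble the combined filtering expression from the text.
base='(PATTERN <= 12)&&(PI in [200:12000])&&#XMMEA_EM'
good='(TIME <= 73227600)'
flare='!(TIME in [73221920:73223800])'
expr="${base}&&${good}&&${flare}"

# Echoed, not executed; remove the leading echo to actually filter.
echo evselect table=mos1.fits:EVENTS withfilteredset=yes \
    "expression=${expr}" filteredset=mos1-filt-time.fits \
    filtertype=expression keepfilteroutput=yes \
    updateexposure=yes filterexposure=yes
```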
Again, give the new file a useful name (mos1-filt-time.fits) and make sure that the updateexposure and filterexposure boxes are checked on the evselect GUI. Time filtering can also be done directly from the light curve by creating a secondary GTI file using the task tabgtigen.
tabgtigen table=rate.fits:RATE expression=RATE<5
gtiset=gtisel.fits timecolumn=TIME
> table input count rate table and extension (§ 5.2.3).
> expression filtering expression; in this case, include those intervals where the count rate is < 5
counts s^-1 in the individual 100 s intervals.
> gtiset output file name for selected GTI intervals.
> timecolumn time column.
The output GTI table can then be used in the filtering expression in evselect with the syntax
&&gti(gtisel.fits,TIME). The full expression would then be:
(PATTERN <= 12)&&(PI in [200:12000])&&#XMMEA_EM&&gti(gtisel.fits,TIME).
Figures 5.6 and 5.7 show the image and light curve generated from the filtered data.
1) With xmmselect running on the filtered file, create an image by selecting the small boxes to the left of
the X and Y columns, clicking on the Image button, and then clicking on the Run button on the
pop-up evselect GUI (for these purposes the default parameters are fine). To select a file name for the image rather than using the default image.ds, select the Image page on the evselect GUI and change the imageset entry.

Figure 5.6: Filtered image of the MOS1 data from the Lockman Hole SV1 observation. Displayed with a square root scale and an upper cut value of 20.
2) On the ds9 window, create a region for a source of interest. Click once on the ds9 image and a region circle
will appear. Click on the region circle and the region will be activated, allowing the region to be moved
and its size to be changed. Having created, placed, and sized the region appropriate for the source, click
the 2D region button on the xmmselect GUI. This transfers the region information into the Selection
expression text area, e.g., ((X,Y) IN circle(26144,22838,600))
for the bright source at the lower center of the Lockman Hole observation. The circle parameters are
the X and Y positions and the radius of the circle in units of 0.05 arcsec, so the above region description is for
a circle of 30 arcsec radius.
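Since region sizes are given in units of 0.05 arcsec, a one-line helper (illustrative only, not a SAS task) converts a region radius in these units to arcseconds:

```shell
# Sketch: convert a region radius from sky-pixel units (0.05 arcsec each)
# to arcseconds.
radius_arcsec() {
    awk -v p="$1" 'BEGIN { printf "%g", p * 0.05 }'
}

# Example: radius_arcsec 600 gives 30 (arcsec), matching the text.
```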
Note: For serious spectral analysis the phrase &&(FLAG == 0) should be added to the selection expression.
This provides the most stringent screening of the data and will exclude events such as those next to the
edges of the CCDs and next to bad pixels which may have incorrect energies.
3) To extract the spectrum, first click the circular button next to the PI column on the xmmselect GUI.
Next click the OGIP Spectrum button. Select the Spectrum page of the evselect GUI to set the file name and binning parameters for the spectrum. For example, set spectrumset to source.ds. The spectralbinsize must be set to 15 for the MOS or 5 for the PN. withspecranges must be checked, specchannelmin set to 0, and specchannelmax set to 11999 for the MOS or 20479 for the PN. Figure 5.8 shows the spectrum.

Figure 5.7: Filtered light curve of the MOS1 data from the Lockman Hole SV1 observation.

Figure 5.8: Spectrum of a source from the Lockman Hole SV1 observation.
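The instrument-dependent binning values can be kept in one place with a small helper (hypothetical, not a SAS task) that prints the evselect spectral settings for MOS or PN as given in the text:

```shell
# Sketch: evselect spectral-binning settings per EPIC instrument,
# as stated in the text (MOS: binsize 15, max channel 11999;
# PN: binsize 5, max channel 20479).
spec_settings() {
    case $1 in
        MOS) echo "spectralbinsize=15 withspecranges=yes specchannelmin=0 specchannelmax=11999" ;;
        PN)  echo "spectralbinsize=5 withspecranges=yes specchannelmin=0 specchannelmax=20479" ;;
        *)   echo "unknown instrument: $1" >&2; return 1 ;;
    esac
}

# Example: evselect ... $(spec_settings MOS)
```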
4) To extract a background spectrum from an annulus surrounding the source, first clear the Selection
expression. Next repeat step 2) except create two circles defining the inner and outer edges of the
background annulus. Use the Properties menu under the ds9 Region menu to set the inner circle to
exclude. Then click the 2D region button on the xmmselect GUI to transfer the region description of
both circles to the Selection expression. This may need to be edited. For example, for an annulus it
should be as follows:
((X,Y) IN circle(26144,22838,1500))&&!((X,Y) IN circle(26144,22838,900)).
This will include data within a circle of radius 75 arcsec but not within a concentric circle of radius 45 arcsec (the values in the region descriptor are in units of 0.05 arcsec). Finally, repeat step 3) except set the spectrumset parameter to a different file name, e.g., back.ds.
5) To extract the source light curve, put the source Selection expression (the region descriptor used in Step
3) in place and click the circular button next to the TIME column on the xmmselect GUI. (Note: if you
forgot to record it, the region selection criteria can be found in the FITS header of the spectrum extension
of the spectrum file, e.g., source.ds.) Next click the OGIP Rate Curve button. Select the Lightcurve tab of the evselect GUI to set the file name and binning parameters for the light curve. For example, set rateset to source.rate and timebinsize to 1000 for a reasonable binning for the source examined in the spectral analysis section. (NOTE: set timebinsize=1 and deselect makeratecolumn to create the light curve for the temporal analysis example in § 5.5. The first forces the time interval to be 1 s and the second creates a count rather than a count-rate column.)
6) Depending on how bright the source is and what modes the EPIC detectors are in, event pile-up may
be a problem. Pile-up occurs when a source is so bright that there is a non-negligible probability
that X-rays will strike two neighboring pixels, or the same pixel more than once, in a read-out
cycle. In such cases the energies of the two events are in effect added together to form one event. If
this happens sufficiently often it will skew the spectrum to higher energies. To check whether pile up
may be a problem, use the SAS task epatplot. To run epatplot create source and background event files
by extracting data from the original event file using the time and region selection expressions combined
with the FLAG == 0 filtering (all PATTERN values are required). On the xmmselect GUI click the Filtered
Table button and check the updateexposure on the evselect General page and provide a filteredset
name, e.g., mos1-source.fits and mos1-back.fits, for the resultant files. Invoke epatplot from the
SAS GUI, enter the source event file name (e.g., mos1-source.fits) for the set parameter on Tab 0 and
set withbackgroundset to yes and provide the background event file name (e.g., mos1-back.fits) for the
backgroundset parameter on Tab 1, and click on Run. If the plot shows the model distributions
for single and double events diverging significantly from the measured distributions then pileup must be
considered. Figure 5.9 shows an example of a bright source (from a different observation) which is not
strongly affected by pileup. The source used in this example is too faint to provide reasonable statistics
for epatplot and is far from being affected by pile up.
7a) Create the photon redistribution matrix, the RMF, using the task rmfgen GUI.
From the SAS GUI, invoke the rmfgen GUI (Figure 5.10)
Set the spectrumset keyword to the spectrum file name, e.g., source.ds
Set the rmfset keyword to the RMF file name, e.g., rmf.ds
Click Run (if your xmmselect GUI is still running, a dialog box will appear asking whether rmfgen can be run; it can, as there is no conflict)
7b) Create the RMF from the command line (the keywords match the GUI entries above):
rmfgen spectrumset=source.ds rmfset=rmf.ds
Figure 5.9: A MOS1 epatplot for a moderately bright source which does not show evidence for pileup. The central source from the observation of G21.5-0.9 (0122700101) is used.
8a) Create the ancillary region file, the ARF, using the arfgen GUI.
On the main tab set the arfset keyword to the ARF file name, e.g., arf.ds
On the effects tab set the badpixlocation keyword to the event file name from which the spectrum
was extracted, e.g., mos1-filt-time.fits
On the calibration tab check the withrmfset box and set the rmfset keyword to the RMF file
name, e.g., rmf.ds
Click Run (if your xmmselect GUI is still running, a dialog box will occur asking whether rmfgen
can be run, it can as there is no conflict)
8b) Create the ARF from the command line.
arfgen arfset=arf.ds spectrumset=source.ds withrmfset=yes rmfset=rmf.ds
badpixlocation=mos1-filt-time.fits
1) Create an attitude file using atthkgen; this is required for the creation of the exposure maps. Note that
the file *ATTTSR* is the attitude file created by the pipeline processing and can also be used. Both the
atthkgen GUI and command line are easy to use.
atthkgen atthkset=attitude.fits timestep=1
> atthkset output file name
> timestep time step in seconds for attitude file
2) Create images in sky coordinates over the PI channel ranges of interest using the task evselect (the GUI
can be used as well). It will use the filtered event list mos1-filt.fits produced above. In this example
evselect is run six times to create the images in two bands (300-2000 eV and 2000-10000 eV) for each
of the three detectors.
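As a sketch, one of the six invocations might look like the following; the output image name and the 600x600 image size are illustrative choices, while the selection expression picks out the soft band:

```shell
# Soft-band (300-2000 eV) sky image from the filtered MOS1 event list
evselect table=mos1-filt.fits:EVENTS withimageset=yes \
    imageset=mos1-soft-img.fits imagebinning=imageSize \
    xcolumn=X ycolumn=Y ximagesize=600 yimagesize=600 \
    expression='PI in [300:2000]'
```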
Figure 5.13 shows the output of implot for the maximum likelihood source detection (emldetect).
Figure 5.13: EPIC count image with the detected sources from the maximum likelihood task created by implot.
1) Use the Xronos command lcurve to produce a binned lightcurve. The following command will also produce
a screen plot using QDP (quit or exit will exit the QDP session).
lcurve nser=1 cfile1=source.lc window=- dtnb=500 nbint=450
outfile=lightcurve.fits plot=yes plotdev=/xw
> nser number of time series
> cfile1 filename first series
> window name of window file (if a subset of the time series is required)
> dtnb bin size (time)
> nbint number of bins per interval
> outfile output file name (FITS format light curve)
> plot plot flag
> plotdev device for plotting, output shown in Figure 5.14
2) Use the Xronos command powspec to calculate the power spectral density. The following command will also
produce a screen plot using QDP (quit or exit will exit the QDP session).
powspec cfile1=source.lc window=- dtnb=100.0 nbint=300 nintfm=INDEF rebin=5
plot=yes plotdev=/xw outfile=power.fits
> cfile1 filename first series
Figure 5.15: Power spectral density for the source analyzed in §5.3.
> window name of window file (if a subset of the time series is required)
> dtnb bin size (time)
> nbint number of bins per interval
> nintfm number of intervals in each power spectrum
> rebin rebin factor for power spectrum (0 for no rebinning)
> plot plot flag
> plotdev device for plotting, output shown in Figure 5.15
> outfile output file name (FITS format power spectrum)
3) Use the Xronos command efsearch to search for periodicities in the time series. The following command
will also produce a screen plot using QDP (quit or exit will exit the QDP session).
efsearch cfile1=source.lc window=- sepoch=INDEF dper=20 nphase=10 nbint=INDEF
nper=100 dres=INDEF plot=yes plotdev=/xw outfile=efsearch.fits
> cfile1 filename first series
> window name of window file (if a subset of the time series is required)
> sepoch value for epoch used for phase zero when folding the time series
> dper value for the period used in the folding
> nphase number of phases per period
> nbint number of bins per interval
> nper number of sampled periods during search
> dres sampling resolution of search
> plot plot flag
> plotdev device for plotting
> outfile output file name (FITS format)
4) Use the Xronos command autocor to calculate the auto correlation for an input time series. The following
command will also produce a screen plot using QDP (quit or exit will exit the QDP session).
autocor cfile1=source.lc window=- dtnb=24.0 nbint=2048 nintfm=INDEF rebin=0
plot=yes plotdev=/xw outfile=auto.fits
> cfile1 filename first series
> window name of window file (if a subset of the time series is required)
> dtnb bin size (time)
> nbint number of bins per interval
> nintfm number of intervals to be summed in each autocorrelation function
> rebin rebin factor for autocorrelation function (0 for no rebinning)
> plot plot flag
> plotdev device for plotting
> outfile output file name (FITS format autocorrelation spectrum)
5) Use the Xronos command lcstats to calculate statistical quantities for an input time series. The following
command will write the output to an ASCII file. (Leave off the > fname to write the results to the screen.)
lcstats cfile1=source.lc window=- dtnb=6.0 nbint=8192 > fname
> cfile1 filename first series
> window name of window file
> dtnb integration time (binning)
> nbint number of bins
> fname output file name
From this point follow the procedures in §5.3.3 and §5.3.4 for spectral analysis and §5.5 for temporal
analysis.
5.7 ODF DATA
The ODF names for the EPIC data will look something like:
Data ID Contents
1) If necessary, rename all files in the ODF directory to upper case. This can be done using the script
provided by the NASA/GSFC XMM-Newton GOF.
2) Initialize SAS (see §4).
3) Create a CIF file using the SAS task cifbuild (§4.5.1). If a CIF file has previously been produced, it is
only necessary to rerun cifbuild if the CCF has changed.
4) Run the SAS task odfingest (§4.5.2). It is only necessary to run it once on any data set (running it a
second time will cause problems). If for some reason odfingest must be rerun, first delete the earlier
*.SAS file (the file produced by odfingest).
5) Run the SAS task emchain. From the command line of a window where SAS has been initialized, simply
enter:
emchain
emchain processes the data from both MOS instruments producing calibrated photon event files. If the
data set has more than one exposure, a specific exposure can be accessed using the exposure parameter,
e.g.:
emchain exposure=n
where n is the exposure number.
6) Run the SAS task epchain, which processes the data from the PN instrument, producing a calibrated photon
event file. From the command line of a window where SAS has been initialized, simply enter:
epchain
To create an out-of-time event file, use the command:
epchain withoutoftime=yes
Adding the parameter keepintermediate=none causes epchain to discard a number of intermediate files.
Once the chains have completed, the same analysis techniques described in the previous sections can be
used on the new event files.
Before beginning this chapter please consult the watchout page at the VILSPA SOC:
http://xmm.vilspa.esa.es/sas/documentation/watchout
This web site discusses current and past SAS bugs and analysis issues, e.g., regarding missing libraries when
using rgsproc with SAS V6.
The INDEX.HTML file will help you navigate. The data file names are of the form:
PiiiiiijjkkaablllCCCCCCnmmm.zzz, where
iiiiii proposal number
jj observation ID - target number in proposal
kk observation ID - observation number for target
aa detector (R1 for RGS1, R2 for RGS2)
b S for a scheduled observation, U for unscheduled
lll exposure number
CCCCCC file identification (Table 6.1)
n spectral order number (for spectra; unimportant otherwise)
mmm source number
zzz file type (e.g., PDF, PNG, FTZ, HTM)
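The convention can be verified mechanically. The following sketch slices a product name from later in this chapter (Figure 6.4) into the fields listed above:

```shell
# Decompose an RGS product name per PiiiiiijjkkaablllCCCCCCnmmm.zzz
f="P0134520301R1S001RATES_0000.FIT"
echo "proposal (iiiiii): $(echo "$f" | cut -c2-7)"
echo "target   (jj):     $(echo "$f" | cut -c8-9)"
echo "obs      (kk):     $(echo "$f" | cut -c10-11)"
echo "detector (aa):     $(echo "$f" | cut -c12-13)"
echo "sched    (b):      $(echo "$f" | cut -c14)"
echo "exposure (lll):    $(echo "$f" | cut -c15-17)"
echo "file id  (CCCCCC): $(echo "$f" | cut -c18-23)"
echo "order    (n):      $(echo "$f" | cut -c24)"
echo "source   (mmm):    $(echo "$f" | cut -c25-27)"
```

For this example file the proposal number is 013452, the detector is R1, and the file identification is RATES_.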
ORDIMG Images, disp. vs. X-disp Zipped FITS, PNG ds9, Ximage, fv, Netscape
IMAGE Images, disp. vs. PI Zipped FITS, PNG ds9, Ximage, fv, Netscape
xspec
XSPEC>data PiiiiiijjkkaablllSRSPEC1mmm.FTZ
XSPEC>back PiiiiiijjkkaablllBGSPEC1mmm.FTZ
XSPEC>resp RGS1 ORDER1.RSP
XSPEC>ignore bad
XSPEC>model wabs*mekal
wabs:nH>1
mekal:kT>1
mekal:nH>
mekal:Abundanc>0.4
mekal:Redshift>
mekal:Switch>0
mekal:norm>1
XSPEC>renorm
XSPEC>fit
XSPEC>setplot device /xs
XSPEC>setplot wave
XSPEC>setplot command window all
XSPEC>setplot command log x off
XSPEC>setplot command wind 1
XSPEC>setplot command r y 1e-5 1.6
XSPEC>setplot command wind 2
XSPEC>setplot command r y -9.99 9.99
XSPEC>plot data residuals
XSPEC>exit
Do you really want to exit? (y)y
Figure 6.1: 1st order RGS1 spectrum of AB Dor. The fit is an absorbed single-temperature mekal model. The
gap between 10 and 15 Å is due to the absence of CCD7.
Figure 6.2: Background event rate from the RGS1 CCD9 chip. The flares are solar events. The time units are
elapsed mission time.
ID      Ext  Format      Contents
ATTTSR  FIT  FITS table  attitude information for the complete observation
attgti  FIT  FITS table  good time intervals from the attitude history
hkgti   FIT  FITS table  good time intervals from the housekeeping files
SRCLI   FIT  FITS table  list of sources and extraction masks
merged  FIT  FITS table  event list merged from all CCDs
EVENLI  FIT  FITS table  merged and filtered event list
EXPMAP  FIT  FITS image  exposure map
SRSPEC  FIT  FITS table  source spectrum
BGSPEC  FIT  FITS table  background spectrum
matrix  FIT  FITS table  response matrix
fluxed  FIT  FITS table  fluxed spectrum (for quick and dirty inspection only)
gunzip platform-htrframes.tar.gz
tar -xvf platform-htrframes.tar
bkgcorrect=no will yield a source spectrum with background events included. The background level will be
subtracted automatically if bkgcorrect=yes. Unless the spectra are of high signal-to-noise, it is recommended
that scientific analysis only be carried out on those spectra where bkgcorrect=no. However, note that the fluxed
spectrum (which is only suitable for initial data inspection) is best examined after declaring bkgcorrect=yes.
New files are written to the working directory. Table 6.2 lists these, and all are uncompressed FITS files. The
filenames are of the same form given in Section 6.1.1:
Even if no solar flares occurred during the observation, it is recommended that the pipeline is re-run in order to
take advantage of the most up-to-date calibration and ensure that region filters more appropriate for the source
are created.
Figure 6.3: Images over the dispersion/cross-dispersion plane (top) and the dispersion/pulse-height plane (bottom).
The lower and upper bananas are 1st and 2nd order events, respectively. The blue lines define the source
extraction regions, one spatial and the other over PI. Horizontal blue lines delineate the internal calibration
sources. The regular chevron background pattern in the right-hand CCDs (1 and 2) is a manifestation of
electronic cross-talk. These events have low PI values and are filtered out by the PI masks.
The resulting curve is provided in Figure 6.4. Note that unlike Figure 6.2 these events have been extracted
across the whole detector and that our Good Time constraint has been adhered to.
3) To plot a spectrum with an approximate wavelength scale, use the M_LAMBDA table column rather than a
response matrix. One important caveat here is that all orders are superimposed in this column:
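A hedged sketch using the FTOOLS plotter fplot (the file and column names are those of the quick-look spectrum shown in Figure 6.5; check fhelp fplot for the exact parameter list):

```shell
# Plot counts against the approximate wavelength column; all
# spectral orders are superimposed in M_LAMBDA
fplot infile=P0134520301R1S001QKSPEC0000.FIT \
    xparm=M_LAMBDA yparm=COUNTS rows=- device=/xw
```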
Figure 6.4: Total event rate from RGS1 after Good Time filtering.
Figure 6.5: RGS1 spectrum binned on the approximate wavelength scale provided in the M_LAMBDA column.
The gap between 10 and 15 Å is due to the missing chip, CCD7. CCD4 is similarly missing in the RGS2 camera.
Both failed after space operations began.
Since this operation will alter only the size of the regions in the sources file, it saves time not to re-make the
event table or re-calculate the exposure map. The pipeline can be entered at five different points; in this case
one need only start from the spectral extraction stage:
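A sketch of the corresponding command, assuming the rgsproc stage names listed later in this section (an initialized SAS session is required):

```shell
# Re-enter the pipeline at the spectral-extraction stage, reusing the
# existing event lists and exposure maps
rgsproc entrystage=4:spectra
```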
Note that this last example will only work if one has retained the event file from a previous re-running of the
pipeline.
These coordinates are written to the RGS source list PiiiiiijjkkaablllEMSRLInmmm.FIT with a source ID
which, in this example, will be 3. Creating the source file is one of the first tasks of the pipeline. If these
new coordinates correspond to the prime source then the entire pipeline must be run again in order to calculate
the correct aspect drift corrections in the dispersion direction. However, if these new coordinates refer to a
background source that should be ignored during background extraction, then the majority of pipeline processing
(drift correction, filtering etc) will remain identical to the previous examples. To save processing time one can
create a new source list by hand and then enter the pipeline at a later stage.
events  Creates attitude time series, attitude-drift and housekeeping GTI tables, pulse-height offsets,
        the source list, and unfiltered, combined event lists.
angles  Corrects event coordinates for aspect drift and establishes the dispersion and cross-dispersion
        coordinates.
filter  Produces filtered event lists and creates exposure maps.
spectra Constructs extraction regions and source and background spectra.
fluxing Creates fluxed spectra for quick data inspection and response matrices.
Provided the filtered event list is retained, users can apply their own filtering by entering the pipeline at the
filter stage.
Changes in the extraction region sizes can be handled by entering at the spectra stage.
If the coordinates of the source differ from those in the original proposal, the pipeline must be run from events.
Extraction of spectra with different binning can be achieved at the spectra stage.
Recalculation of the response matrices can be done in the final fluxing stage.
xspec
XSPEC>data 1:1 PiiiiiijjkkaablllSRSPEC1mmm.FIT 1:2 PiiiiiijjkkaablllSRSPEC2mmm.FIT
XSPEC>ignore bad
XSPEC>model phabs*mekal
etc...
6.6 APPROACHES TO SPECTRAL FITTING
For data sets of high signal-to-noise and low background, where counting statistics are within the Gaussian
regime, the data products above are suitable for analysis using the default fitting scheme in XSPEC,
χ²-minimization.
However, for low count rates, in the Poisson regime, χ²-minimization is no longer suitable. With low count rates
in individual channels, the error per channel can dominate over the count rate. Since channels are weighted
by the inverse-square of the errors during χ² model fitting, channels with the lowest count rates are given
overly large weights in the Poisson regime. Spectral continua are consequently often fit incorrectly, with the
model lying underneath the true continuum level.
This will be a common problem with most RGS sources. Even if count rates are large, much of the flux from
these sources can be contained within emission lines rather than continuum. Consequently, even obtaining
correct equivalent widths for such sources is non-trivial. There are two approaches to fitting low signal-to-noise
RGS data, and the correct approach would normally be a combination of the two.
grppha
> Please enter PHA filename[] PiiiiiijjkkaablllSRSPEC1mmm.FIT
> Please enter output filename[] !PiiiiiijjkkaablllSRSPEC1mmm.FIT
> GRPPHA[] group min 30
> GRPPHA[] exit
The disadvantage of using grppha is that, although channel errors are propagated correctly through the binning
process, the errors column in the original spectrum product is not strictly accurate. The problem
arises because there is no good way to treat the errors in channels containing no counts. To allow statistical
fitting, these channels are arbitrarily given an error value of unity, which is subsequently propagated through
the binning. Consequently, the errors are over-estimated in the resulting spectra.
An alternative approach is to bin the data during spectral extraction. The easiest way to do this is to call the
RGS pipeline again after it has completed. The following rebins the pipeline spectrum by a factor of 3:
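A sketch of the re-run, assuming the rebin parameter of rgsproc (verify against the on-line task documentation):

```shell
# Re-run only the spectral extraction and fluxing stages,
# binning the spectrum by a factor of 3
rgsproc entrystage=4:spectra finalstage=5:fluxing rebin=3
```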
One disadvantage of this approach is that one can only choose an integer multiple of the original channel size. To
change the sampling of the events, the pipeline must be run from angles or earlier:
The disadvantage of using rgsproc, as opposed to grppha, is that the binning is linear across the dispersion
direction. Velocity resolution is lost in the lines; e.g., the accuracy of redshift determinations will be degraded,
transition edges will be smoothed and neighboring lines will become blended.
6.6.2 Maximum-Likelihood Statistics
The second alternative is to replace the χ²-minimization scheme with the Cash maximum-likelihood scheme
when fitting the data. This method is much better suited to data with low count rates, and is a suitable option
only if one is running XSPEC v11.1.0 or later. The reason for this is that RGS spectrum files have prompted a
slight modification to the OGIP standard. Because the RGS spatial extraction mask has a spatial width which
is a varying function of wavelength, it has become necessary to characterize the BACKSCL and AREASCL
parameters as vectors (i.e., one number for each wavelength channel), rather than as scalar keywords as they are
for data from the EPIC cameras and past missions. These quantities map the size of the source extraction
region to the size of the background extraction region and are essential for accurate fits. Only XSPEC v11.1.0
or later versions are capable of reading these vectors, so ensure that you have an up-to-date installation at your
site.
One caveat of using the cstat option is that the scheme requires both a total and a background spectrum to be
loaded into XSPEC in order to calculate parameter errors correctly. Consequently, be sure not to use
the net spectra that were created as part of product packages by SAS v5.2 or earlier. To change schemes in
XSPEC before fitting the data, type:
XSPEC>statistic cstat
Observing extended sources effectively broadens the PSF of the spectrum in the dispersion direction. Conse-
quently it is prudent to also increase the width of the PI masks, using the pdistincl parameter, in order to
prevent event losses.
1. An OGIP FITS image of the source. The better the resolution of the image, the more accurate the
convolution. For example, if a Chandra image of the source is available, this will provide a more accurate
result than an EPIC image.
2. An ASCII file called, e.g. xsource.mod, containing three lines of input. It defines three environment
variables and should look like this example:
RGS_XSOURCE_IMAGE ./MOS1.fit
RGS_XSOURCE_BORESIGHT 23:25:19.8 -12:07:25 247.302646
RGS_XSOURCE_EXTRACTION 2.5
Figure 6.6: The top figure is a thin, thermal plasma at 2 keV from a point source. The lower figure is the same
spectral model, but convolved with the MOS1 0.3-2.0 keV spatial profile of a low-redshift cluster.
The lines of the script for setting up and running SAS are specific to installation at GSFC and so will need to
be modified as appropriate. The script uses the SAS command-line interface and goes through the following
steps:
1) Copies the raw and pipelined data from the XMM archive.
2) Initializes SAS.
3) Creates a Current Calibration file.
4) Builds an ODF summary file.
5) Constructs a GTI file based on background activity.
6) Runs the RGS pipeline.
7) Makes a few useful data inspection products.
8) Fits a model to one of the resulting spectra.
Chapter 7
The OM is somewhat different from the other instruments on board XMM-Newton, and not only because it
is not an X-ray instrument. Since the OM pipeline products can be used directly for most science analysis
tasks, a re-processing of the data is not needed in most cases. So in principle one can ignore the files in the
ODF directory and go directly to §7.1, which describes the files in the PPS (or PIPEPROD) directory. Users
interested in re-processing the OM data can go directly to §7.2, which explains the pipeline processing. For
the analysis of OM data obtained in FAST or GRISM mode, however, a re-processing of the data is needed,
which is explained in more detail in §7.2.2 and §7.2.3.
Table 7.2: Some of the important columns in the SWSRLI FITS file.
Figure 7.1: Merged OM image of the Lockman Hole SV1 observation obtained with the V filter. The image is
displayed in logarithmic scale with an upper cut value of 20,000.
resolution. The task ommosaic can also be used to combine images observed with different filters. Note that the
final image is not corrected for coincidence losses or for deadtime.
Throughout the OM section of this ABC Guide, public data from the Lockman Hole SV1 observation
(OBS-ID 0123700101) have been used to illustrate the SAS tasks. We suggest that the user download these
data and retrace the following procedures. Figure 7.1 shows the merged V-band image from the Lockman
Hole SV1 observation using the ommosaic task.
You can also use a program written at the NASA/GSFC XMM-Newton GOF. The task is meant to be
used on files in the PPS directory (which contains the outputs of the OM pipeline). It produces final event
and exposure images in sky coordinates for each of the filters used in the observation. Low- and high-resolution
images are treated separately. The task requires that FTOOLS and Perl are installed on your machine, and
the script must be run from a writable directory which contains the OM files. The tar file with the script is
available at the GOF site: ftp://legacy.gsfc.nasa.gov/xmm/software/om_tools/om_prod_all.tar.gz.
The program is fairly easy to use, and to modify. If any problems arise with the task please contact the GOF.
The IMAGING, FAST, and GRISM chains (omichain, omfchain, and omgchain) are described below. We
have also written an equivalent to the omichain which allows one to vary the input parameters of each task, or
to run the pipeline on only a subset of the data.
ODF Products
In IMAGING mode, OM files in the ODF directory look like:
0070_0123700101_OMS00400IMI.FIT 0070_0123700101_OMS42200RFX.FIT
0070_0123700101_OMS00400RFX.FIT 0070_0123700101_OMS42200THX.FIT
0070_0123700101_OMS00400THX.FIT 0070_0123700101_OMS42200WDX.FIT
0070_0123700101_OMS00400WDX.FIT 0070_0123700101_OMS42201IMI.FIT
0070_0123700101_OMS00401IMI.FIT 0070_0123700101_OMS42300IMI.FIT
0070_0123700101_OMS00500IMI.FIT 0070_0123700101_OMS42300RFX.FIT
0070_0123700101_OMS00500RFX.FIT 0070_0123700101_OMS42300THX.FIT
...
For each exposure there are: an image file (IMI), a tracking history file (THX), and a window data
auxiliary file (WDX). There is one non-periodic (NPH) and one periodic (PEH) housekeeping file per observa-
tion. In order to run a task, you will also need three files that are not specific to the OM. The first one
(0070_0123700101_SCX00000SUM.SAS) is an ASCII file containing a summary of the observation, which is cre-
ated by the odfingest task (§4.5.2):
Please note that the keyword PATH can be edited to match your current location of the data.
The second general file (0070_0123700101_SCX00000TCS.FIT) is the spacecraft time correlation file, while
the third (0070_0123700101_SCX00000ATS.FIT) contains the spacecraft attitude information.
Re-processing of Imaging data can be done automatically using omichain. The task omichain runs on
filters specified by the user. If no arguments are given, the chain runs on all the files present in the $SAS_ODF
directory. If the omichain tasks are re-run one by one, there may be small differences between the files obtained
in this manner and the pipeline products in the PPS directory. The main reasons for the differences are
improvements made to the SAS software, the type of products produced by the pipeline (for example, only the
most recent products have a final image for each filter), and some changes in the calibration products.
The following explains the step-by-step processing of OM files. At the end of this section, a script
is provided which goes automatically through all of the steps described below. The script is essentially an
annotated version of omichain and shows what the processing does. We suggest that the user goes through all
of the steps at least once manually before using the script.
The FILTER keyword in the initial ODF file is a number between 0 and 2100. The correspondence between
number and filter value is given in Table 7.3.
File ID Filter
0000    White (datum)
0200    Grism 2 (Optical)
0400    UVW1
0600    UVM2
0800    UVW2
1000    Grism 1 (UV)
1200    blocked
1400    V
1600    Magnifier
1800    U (no bar)
2000    B
2100    Bar
We have written a script which goes through the complete list of files and gives back the filter used for
each exposure. The script is available at:
ftp://legacy.gsfc.nasa.gov/xmm/software/om_tools/file_examine.shell
Running this script provides a list of files and their associated filters. The details of the association are
less complicated than it may appear at first. In the standard configuration (the so-called Rudi-5 mode) one gets
exposures in groups of 5, in high- and low-resolution mode, for a total of 10 files per filter. The high-resolution
mode covers the same small central window in all five exposures while the low-resolution mode covers different
parts of the detector in each of the 5 exposures. The sum of the low-resolution exposures covers the entire FOV.
In general, the number following OMS will either be of the form 00400, 00401, 00500, ... or 40100, 40101,
40200, ... The last two digits indicate the resolution: 00 is high-resolution and 01 is low-resolution. In
this example, the high-resolution window will be called 0070_0123700101_OMS00400IMI.FIT.gz while the low-
resolution window will be 0070_0123700101_OMS00401IMI.FIT.gz. The low-resolution images for each of the
five frames are taken consecutively to obtain the full FOV. For each low-resolution frame there is a high-resolution
frame of the inner part of the detector. Here is an example of what running the script, file_examine.shell, pro-
duces:
/XMM/Mydata/ODF: ./file_examine.shell
0070_0123700101_OMS00400IMI.FIT FILTER V
0070_0123700101_OMS00401IMI.FIT FILTER V
0070_0123700101_OMS00500IMI.FIT FILTER U
0070_0123700101_OMS00501IMI.FIT FILTER U
0070_0123700101_OMS00600IMI.FIT FILTER WHITE
0070_0123700101_OMS00601IMI.FIT FILTER WHITE
0070_0123700101_OMS41500IMI.FIT FILTER V
0070_0123700101_OMS41501IMI.FIT FILTER V
0070_0123700101_OMS41600IMI.FIT FILTER V
0070_0123700101_OMS41601IMI.FIT FILTER V
0070_0123700101_OMS41700IMI.FIT FILTER V
0070_0123700101_OMS41701IMI.FIT FILTER V
0070_0123700101_OMS41800IMI.FIT FILTER V
0070_0123700101_OMS41801IMI.FIT FILTER V
0070_0123700101_OMS41900IMI.FIT FILTER U
0070_0123700101_OMS41901IMI.FIT FILTER U
As noted above, the last three digits are paired so that in general one gets a low- and a high-resolution
image. In this example the following images for the V filter are produced: 00401/00400 (inner part of the low-
resolution image plus high-resolution frame of the inner part), 41501/41500 (left-hand frame of the low-resolution
image plus high-resolution frame of the inner part), 41601/41600 (bottom frame of the low-resolution image plus
high-resolution frame of the inner part), 41701/41700 (right-hand frame of the low-resolution image plus high-
resolution frame of the inner part), and 41801/41800 (top frame of the low-resolution image plus high-resolution
frame of the inner part). This means that usually five high-resolution images are produced which can be co-added
to achieve deeper exposures. Please be aware that one should NOT add low-resolution and high-resolution
images, even if they cover the same part of the FOV (e.g., one cannot add 0070_0123700101_OMS00401IMI.FIT
and 0070_0123700101_OMS00400IMI.FIT).
Once one has decided which data to process (for example one exposure of one filter taken with a certain
resolution), one should make sure that
1) A Calibration Index File has been created using cifbuild (§4.5.1).
2) A summary file of the ODF constituents has been created using odfingest (§4.5.2).
3) One set of exposures has been chosen on which to run omprep.
As an example, we use the first high-resolution exposure for the Lockman Hole SV1 data and have copied
them into the /XMM/Mydata directory. The files associated with the exposure are:
/XMM/Mydata: ls
0070_0123700101_OMS00400IMI.FIT 0070_0123700101_OMX00000PEH.FIT
0070_0123700101_OMS00400THX.FIT 0070_0123700101_SCX00000ATS.FIT*
0070_0123700101_OMS00400WDX.FIT 0070_0123700101_SCX00000SUM.ASC*
0070_0123700101_OMX00000NPH.FIT 0070_0123700101_SCX00000SUM.SAS
The file 0070_0123700101_SCX00000SUM.SAS has been edited to point to that directory, SAS_ODF also
points to this directory, and SAS_CCF points to the file ccf.cif generated by cifbuild (§4.5.1).
Note: omdetect does a variable job with the stray-light features and may sometimes be fooled by them.
One way to separate them from real detections is to look at the FWHM max and min parameters in the
source list. Spurious source detections associated with stray-light features will have large values for these
parameters.
Note: There is a recipe to convert the UV count rates to flux. The recipe was provided by Alice
Breeveld (MSSL) and can be accessed at:
http://xmm.vilspa.esa.es/sas/documentation/watchout/uvflux.shtml
Note: Due to the large size of the catalog, it is not distributed. Users, however, can provide their own
catalog if they wish. The format is that used for the USNO cross-correlation FITS products. In general,
the usecat keyword should be set to no.
Note: The pointing stability about the spacecraft boresight position is better than 1 arcsec (look at the tracking
plots derived at the beginning). There is still a scatter of about 4 arcsec between the planned and actual pointing
position.
There is a script which does all this step by step and allows one to run the pipeline only on the desired
file. The script is available at:
ftp://legacy.gsfc.nasa.gov/xmm/software/om_tools/omproc_gof
Please contact the GOF if you have any problems with it.
7.2.2 Fast Mode
SAS has a working fast mode pipeline. If the data have not been processed by the latest version of SAS, the
task omfchain should be run.
The chain works similarly to the imaging chain explained above, and consists of a Perl script which calls all
the necessary tasks sequentially. It produces images of the detected sources, extracts events related to the sources
and the background, and extracts the corresponding light curves. A more detailed description of the chain can
be found in the SAS on-line help available at http://xmm.vilspa.esa.es/sas/current/doc/index.html. You
can also access the general description of the task at: ftp://legacy.gsfc.nasa.gov/xmm/doc/fastmode.ps.gz.
A summary of the task is shown in Figure 7.2.
The sequence of tasks used by omgchain is illustrated in Fig. 7.3. An output spectrum produced by
omgchain is given in Fig. 7.4. Each of these tasks can be run individually. SAS V6 also includes a new
interactive task, omgsource, which allows the user to select the spectrum to be extracted with the cursor.
Figure 7.3: Diagram of the different tasks used by omgchain. The first four tasks are preparatory, the other
three tasks execute the source detection and spectral identification procedure, and produce the output files.
The task omgchain has many parameters, but none of them are mandatory. Below is a description of the
calling sequence and the individual parameters.
Note: if a source is not detected by omdetect, or does not fall within the grism window, omgchain will
run without warning, but will not produce output files.
Figure 7.4: OM optical grism spectrum obtained from a 4.7 ks observation of Mrk 478.