Pipeline DR software is provided at the telescope to allow observers to assess the quality of their data in real time. The data reduction is actually quite sophisticated and may, in many cases, yield publishable results! Data reduction “Recipes” are provided to deal with the different methods of observing in the near-IR. If you do not find one specific to your needs, then please contact your support scientist – given sufficient notice, we may be able to provide a new recipe for your observing run.
Many of the available DR recipes, or methods of observing/reducing the data, expect a flat and an arc spectrum. Indeed, the pipeline software may stop if a flat or an arc hasn’t been observed with the same instrument setup just prior to the target observations. Likewise, many recipes insist on a standard star observation before a target observation, since this will always be required for proper data reduction, be it pipeline DR or offline reduction with Starlink or IRAF software. Note, however, that recipe versions are available which do not require a flat, an arc or a standard; these have the suffix _NOFLAT, _NOARC and _NOSTD (and _NOFLAT_NOSTD!). A full list of available recipes is given here.
To flat-field or not to flat-field…
This may indeed be the question… Strictly speaking, the pixel-to-pixel response of the array, and the wavelength-dependent transmission of the telescope and UIST optics, will be corrected for when an extracted target spectrum is divided by an extracted standard star spectrum, provided that the two stars are observed on exactly the same rows of the array. However, since this is usually not the case, it is always prudent to flat-field all data with an internal black-body lamp image. The pipeline DR will do this (unless _NOFLAT is specified in the recipe); the additional noise introduced by the flat-field division should be minimal, given the bright lamp signal. The DR recipe “REDUCE_FLAT” normalises the flat-field image by fitting a blackbody function to all rows in the dispersion direction (assuming a specific temperature for the BB lamp). Any remaining deviation in the normalised BB “image” will be due to instrumental transmission effects.
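As an illustration of what this normalisation step does (this is a Python sketch, not ORAC-DR code; the lamp temperature is an assumed, illustrative value), each row of the flat is divided by a unit-mean blackbody curve along the dispersion direction, leaving only the pixel-to-pixel gain structure:

```python
import numpy as np

def planck(wavelength_um, t_kelvin):
    """Planck function B_lambda (arbitrary units) at wavelengths in microns."""
    h, c, k = 6.626e-34, 3.0e8, 1.381e-23
    wl = wavelength_um * 1e-6
    return 1.0 / (wl**5 * (np.exp(h * c / (wl * k * t_kelvin)) - 1.0))

def normalise_flat(flat, wavelengths_um, t_lamp=1173.0):
    """Divide each row of a flat image by the BB curve, then rescale to unit mean.

    flat: 2-D array with dispersion along axis 1.
    t_lamp: assumed black-body lamp temperature (illustrative value only).
    """
    bb = planck(wavelengths_um, t_lamp)
    bb = bb / bb.mean()                    # unit-mean blackbody curve
    normalised = flat / bb[np.newaxis, :]  # remove the lamp spectrum row by row
    return normalised / normalised.mean()  # rescale to unit mean
```

After this division, what remains in the normalised image is the pixel-to-pixel response (plus any instrumental transmission effects, as noted above).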
An arc, an arc…
My kingdom for an arc! An argon lamp is currently used for wavelength calibration. Note, however, that arc-lamp spectra may not be necessary for moderate-resolution spectroscopy, particularly in the H and K bands, where there are many atmospheric OH lines. If you choose not to observe arc spectra, check that suitably bright sky lines are available for calibration (a dark frame may also be useful for subtracting bad pixels from the raw calibration frame). However, given the short time needed to take an arc, skipping this calibration is NOT recommended.
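Whether you use an arc lamp or sky lines, the calibration itself amounts to fitting a dispersion solution through the measured line positions. The Python sketch below illustrates the idea; the pixel centroids and wavelengths here are purely illustrative, not a real OH or argon line list:

```python
import numpy as np

# Hypothetical line identifications: pixel centroids of measured lines and
# their known wavelengths (microns). These numbers are illustrative only.
pixel_pos   = np.array([102.3, 310.8, 455.1, 611.9, 790.4])
wavelengths = np.array([1.5043, 1.5336, 1.5539, 1.5760, 1.6013])

# Fit a low-order polynomial dispersion solution: wavelength = f(pixel).
coeffs = np.polyfit(pixel_pos, wavelengths, deg=2)
dispersion = np.poly1d(coeffs)

# Calibrate the full wavelength axis and inspect the fit residuals.
axis = dispersion(np.arange(1024))
residuals = wavelengths - dispersion(pixel_pos)
```

The residuals give a quick check on whether enough suitably bright lines were available across the array.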
Running the pipeline
The template observing sequences available in the UKIRT-OT contain recipes appropriate to the associated observing mode. You should not normally need to change the recipe; if you do, be careful, as many have specific requirements in terms of flat fields and standards, which must be acquired before a target observation is obtained and reduced on-line. Tables of available DR recipes – and links to detailed descriptions of them – are available.
1. To run ORAC-DR at the telescope type the following:
oracdr_uist
oracdr -loop flag
2. To run ORAC-DR at your home institute type the following:
oracdr_uist 20071225
setenv ORAC_DATA_IN /home/cdavis/my/raw/data/
setenv ORAC_DATA_OUT /home/cdavis/my/reduced/data/
oracdr -list 1:100 &
When running ORAC-DR at your home institution, the pipeline needs to know when the data were taken (since files are labelled with the UT date), where the raw data are, and where reduced data are to be written. Obviously these directories need to exist on your machine, and the raw data need to be in the specified directory. In the above example, frames 1 to 100 from 20071225 will be reduced.
Note that the “array test” observations taken at the start of the night MUST be reduced BEFORE you try to reduce a block of your own data. Use the -list option to specify the data range. For example, if frames 1-19 were darks taken to measure the readnoise and set up a bad pixel mask (in the log the DR recipe will be set to MEASURE_READNOISE and DARK_AND_BPM), and frames 38, 39 and 40-43 were the flat, arc and ABBA-style “quad” of observations on your standard star, you could type the following:
oracdr -list 1:19,38:43 &
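The frame-list syntax accepted by -list can be illustrated with a small Python sketch (a hypothetical helper written for this page, not part of ORAC-DR): comma-separated terms are concatenated, and a:b is an inclusive range.

```python
def parse_frame_list(spec):
    """Expand an ORAC-DR style frame list such as '1:19,38:43' into frame numbers.

    Illustrative helper only: 'a:b' is an inclusive range, and
    comma-separated terms are concatenated in order.
    """
    frames = []
    for term in spec.split(","):
        if ":" in term:
            lo, hi = (int(x) for x in term.split(":"))
            frames.extend(range(lo, hi + 1))
        else:
            frames.append(int(term))
    return frames
```

So "1:19,38:43" expands to frames 1 through 19 followed by 38 through 43, matching the array-test example above.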
Having reduced the array tests and flat/arc/standard star calibrations, you can then reduce your target observations, e.g.
oracdr -from 44 &
When running ORAC-DR, several windows will open as they are needed: an ORAC text display, GAIA windows, and kapview 1-D spectral plotting windows. If you are at the telescope, the pipeline will reduce the data as they are stored to disk, using the recipe name in the image header.
The DR recipe name stored in each file header will be used by the pipeline. However, this can be overridden if, for example, you decide you do not want to ratio the target spectra by a standard star, e.g.:
oracdr POINT_SOURCE_NOSTD -list 31:38
where 31 to 38 were the eight observations of the target.
Help on this and other ORAC-DR topics is available by typing
To exit (or abort) ORAC-DR, click on EXIT in the text log window, or type ctrl-c in the xterm. The command oracdr_nuke can be used to kill all DR-related processes, should you be having problems.
Extracted spectra may also be examined using SPLAT. In the reduced data directory (cd $ORAC_DATA_OUT) type “splat &”.
Finally, note that there are tips on how to reduce and analyse spectroscopy data in this Starlink Cookbook.
Read Noise and Variance
Finally, if you re-reduce your data on another machine (down in Hilo or back at your home institute) using ORAC-DR, you will probably have to reduce the READ_NOISE dark exposures taken at the start of the night first. This sequence of array-characterisation observations gives a measure of the readnoise as a general health check at the start of each night of UIST observations. However, this information is also used by the DR to estimate whether the data are background-limited or not, and to assign and maintain a variance array component in the data at each stage of the reduction.
Briefly, the variance mapped across the array is derived from the readnoise variance (usually about 40e- or about 3 counts) and the Poisson variance (based on the signal from the sky and source – so the variance will be higher along the spectrum of a bright star); this then gets propagated through the DR when images are subtracted/co-added/divided-by-standard, etc. The variance is used to weight the data during optimal extraction of source spectra (Figaro’s optextract routine) so that a cleaner spectrum can be obtained.
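This bookkeeping can be sketched in Python as follows. The gain value is an assumption for illustration, and the extraction weights follow the standard inverse-variance, profile-weighted scheme rather than Figaro's actual optextract code:

```python
import numpy as np

def variance_image(counts, readnoise_counts=3.0, gain=4.0):
    """Per-pixel variance in counts^2: readnoise variance plus Poisson variance.

    The 3-count readnoise follows the text; the gain is an assumed value.
    """
    return readnoise_counts**2 + np.clip(counts, 0, None) / gain

def subtract(data_a, var_a, data_b, var_b):
    """Propagate variance through an image subtraction: variances add."""
    return data_a - data_b, var_a + var_b

def optimal_extract(column, var, profile):
    """Variance-weighted, profile-weighted sum along one spatial column,
    of the kind used in optimal extraction of a source spectrum."""
    w = profile / var
    return np.sum(w * column) / np.sum(w * profile)
```

Because the Poisson term scales with the signal, the variance is indeed higher along the spectrum of a bright star, and down-weighting noisy pixels in this way is what yields the cleaner extracted spectrum.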
So, each data file will consist of two components, the actual data (counts from the source and sky) and the variance calculated for each pixel across the array. You can create an image of the variance across a spectral image using the starlink-kappa commands:
> creobj type=ndf dims=0 object=gu20030101_999_var
> copobj gu20030101_999.variance gu20030101_999_var.data_array
> gaiadisp gu20030101_999_var
Step 1 creates a blank image called gu20030101_999_var.sdf, step 2 copies the variance information from the reduced group image gu20030101_999.sdf to the data array in gu20030101_999_var.sdf, and step 3 displays gu20030101_999_var.sdf in a GAIA window.
Note for IRAF Users: If your data have been converted to FITS using the CONVERT package, then the two array components in each NDF (.sdf) will be converted to FITS extensions. Because pipeline-reduced spectral images contain both the data and the variance array, it will be necessary to specify which extension you want when reading in each file.
In other words, if you get an error like ERROR: FXF: must specify which FITS extension, try
> imhead u20020101_00999_wce.fit[0]
or
> imhead u20020101_00999_wce.fit[1]
The former will be the data, the latter the variance.
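If you work in Python rather than IRAF, the same layout can be read with a short sketch (this assumes the astropy package is installed; the layout follows the convention that the data array sits in the primary HDU and the variance in the first extension, as described above):

```python
from astropy.io import fits

def read_ndf_fits(path):
    """Return (data, variance) arrays from a CONVERT-style FITS file:
    data from the primary HDU, variance from the first extension."""
    with fits.open(path) as hdul:
        return hdul[0].data, hdul[1].data
```

As with the IRAF example, trying to read such a file without distinguishing the two arrays is what triggers the extension error above.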