HET Data Reduction Tips


Below are data reduction tips. This page does not walk you through reducing your data step by step; it just outlines the general data reduction procedure and offers advice for each instrument.

Choose an instrument:

LRS Data Reduction Tips

Fixing pixels

There is a bad pixel mask available on rhea in the calibrations/LRS directory. This pixel mask was created by Joe Tufts and was based on 1x1 binned images taken by Phillip MacQueen. It has been re-binned to 2x2 (correcting for the odd number of pre-scan pixels). The file is called bpm2.fits.
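
As a rough sketch of what applying the mask amounts to outside IRAF (the file name lrs_frame.fits and the convention that non-zero mask pixels are bad are assumptions; check the mask header), bad pixels can be interpolated over along each row with numpy and astropy:

    import numpy as np
    from astropy.io import fits

    # A minimal sketch, assuming the science frame is 2x2 binned to match
    # bpm2.fits and that non-zero mask values mark bad pixels.
    data, hdr = fits.getdata("lrs_frame.fits", header=True)
    data = data.astype(float)
    bad = fits.getdata("bpm2.fits") > 0

    # Interpolate over bad pixels along each row, like a simple fixpix.
    x = np.arange(data.shape[1])
    for row in range(data.shape[0]):
        b = bad[row]
        if b.any() and not b.all():
            data[row, b] = np.interp(x[b], x[~b], data[row, ~b])

    fits.writeto("lrs_frame_fixed.fits", data, header=hdr, overwrite=True)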

Flat fielding

There is a white light illumination flat field available on rhea in the calibrations/LRS directory. This was created by Joe Tufts and was based on 1x1 binned images taken by Phillip MacQueen. It has been re-binned to 2x2 (correcting for the odd number of pre-scan pixels). The file is called ffha2_swbin.fits.
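
As a rough sketch of the flat-field step itself (the science file name is a placeholder, and the flat is simply normalized to its median here), the division can be done with numpy and astropy:

    import numpy as np
    from astropy.io import fits

    # A minimal sketch: divide an LRS frame by the normalized illumination flat.
    flat = fits.getdata("ffha2_swbin.fits").astype(float)
    sci, hdr = fits.getdata("lrs_frame.fits", header=True)

    norm = flat / np.median(flat)        # normalize the flat to its median level
    norm[norm <= 0] = 1.0                # avoid dividing by zero or negative pixels
    fits.writeto("lrs_frame_flat.fits", sci / norm, header=hdr, overwrite=True)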

Wavelength calibration

NOTE: The LRS_g3 grism has a dispersion that runs in a direction opposite to the LRS_g1 and LRS_g2 grisms.

MRS Data Reduction Tips

HRS Data Reduction Tips

The FITS Extensions

The HRS detector is actually 2 CCDs saved within a single FITS file as multiple image extensions. The zeroth extension, test.fits[0], is the header that contains most of the telescope and instrument information. The first extension, test.fits[1], contains a short header with CCD-specific information and the red HRS (MM1) CCD. The second extension, test.fits[2], contains a short header with CCD-specific information and the blue HRS (MM1) CCD. IRAF has a package for dealing with FITS extensions: mscred. The most useful task in this package is mscsplit, which splits the extended image into 3 separate IRAF images with 3 separate headers. This allows each CCD to be reduced separately. Keep in mind that the telescope and instrument information will not be in the headers of each data frame.
There is now a script, hsplit.cl, which will split the sections and copy the header information from the [0] header into the [1] and [2] headers. This task can also correct some errors in the CCDSEC keyword, which allows CCDPROC to work on HRS files. Please note that after a recent ICE upgrade a small change may be required to the hsplit.cl code (see the documentation in the code itself).
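
If you prefer to work outside IRAF, the split and header copy can be sketched with astropy as below (the output file names are assumptions; hsplit.cl remains the reference for the CCDSEC corrections):

    from astropy.io import fits

    # A minimal sketch of what mscsplit/hsplit.cl accomplish for the layout
    # described above: write each CCD to its own file and carry the
    # telescope/instrument keywords from the [0] header into it.
    with fits.open("test.fits") as hdul:
        primary = hdul[0].header
        for ext, label in ((1, "red"), (2, "blue")):
            hdr = hdul[ext].header.copy()
            for card in primary.cards:
                if (card.keyword and card.keyword not in hdr
                        and card.keyword not in ("SIMPLE", "EXTEND", "NEXTEND")):
                    hdr.append(card)
            fits.writeto("test_%s.fits" % label, hdul[ext].data, header=hdr,
                         overwrite=True)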

Bias

Bias frames are very useful for removing roughly half of the bad columns found on the red HRS CCD. We suggest comparing the 5 bias frames taken on your night with a master bias created from all of your bias frames, to check for any change from night to night. A bias should be subtracted from the red HRS CCD. The blue HRS CCD is very clean, so bias subtraction is of little help in the data reduction and can actually add noise if a noisy (non-master) bias is used.
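
A master bias can be built by median-combining the individual bias frames; the sketch below (file names assumed, red CCD in extension 1 as described above) shows the idea with numpy and astropy:

    import glob
    import numpy as np
    from astropy.io import fits

    # A minimal sketch: median-combine all bias frames into a master bias and
    # subtract it from the red HRS CCD image.
    bias_stack = np.array([fits.getdata(f, ext=1).astype(float)
                           for f in sorted(glob.glob("bias*.fits"))])
    master_bias = np.median(bias_stack, axis=0)

    sci, hdr = fits.getdata("object_red.fits", header=True)
    fits.writeto("object_red_b.fits", sci - master_bias, header=hdr,
                 overwrite=True)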

Darks

Dark frames are very useful for removing about a quarter of the remaining bad columns found on the red HRS CCD. However, the overhead involved in creating a dark frame for each binning and exposure length is too much of a burden for the current calibration plan. We suggest using fixpix to remove these bad columns.

Fixing pixels

Using the IRAF task fixpix (or a similar task) can aid the reduction process, particularly in tracing apertures on the red HRS CCD. However, this should be done with care, since it is cooking your data (replacing bad columns with interpolated, and therefore dubious, values).

Scattered Light Removal

In standard long-slit (or any slit) spectroscopy some people do not find it necessary to remove scattered light, because the scattered light will be removed with the sky subtraction. THIS IS NOT ALWAYS TRUE FOR FIBER SPECTROSCOPY, AND WHEN FRINGING IS PRESENT IT IS NOT TRUE EVEN FOR LONG-SLIT SPECTROSCOPY. Because the fibers can have different throughputs, a scaling factor may be applied when removing the sky light, and this will scale the scattered light inappropriately. In addition, if the scattered light is not monochromatic it will not produce fringing. Thus scattered light removal should be done for all HRS spectra, and done BEFORE flat field correction. In the IRAF echelle package the task for removing scattered light is apscatter. This task fits a 2-D surface to the scattered light: it first fits in the cross-dispersion direction, leaving out any defined apertures, and then fits the surface along the dispersion direction. Earlier versions of IRAF sometimes fail to subtract the scattered light correctly unless the task is run interactively, so I suggest testing your version in both the interactive and non-interactive modes.
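
The same idea can be sketched outside IRAF (the aperture mask and file names are assumptions, and apscatter itself does the fitting far more carefully): mask the orders, fit a smooth function along the cross-dispersion direction in each column, then smooth the fitted surface along the dispersion direction before subtracting it.

    import numpy as np
    from astropy.io import fits
    from numpy.polynomial import chebyshev

    # A minimal apscatter-style sketch.  'order_mask.fits' (True inside the
    # apertures) is an assumed product of your own order trace.
    img = fits.getdata("object_red_b.fits").astype(float)
    order_mask = fits.getdata("order_mask.fits").astype(bool)

    scatter = np.zeros_like(img)
    rows = np.arange(img.shape[0])

    # Fit along the cross-dispersion direction, column by column,
    # ignoring the pixels inside the apertures.
    for col in range(img.shape[1]):
        good = ~order_mask[:, col]
        coef = chebyshev.chebfit(rows[good], img[good, col], deg=4)
        scatter[:, col] = chebyshev.chebval(rows, coef)

    # Then smooth the fitted surface along the dispersion direction.
    cols = np.arange(img.shape[1])
    for row in range(img.shape[0]):
        coef = chebyshev.chebfit(cols, scatter[row], deg=4)
        scatter[row] = chebyshev.chebval(cols, coef)

    fits.writeto("object_red_s.fits", img - scatter, overwrite=True)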

Flat Fields

NOTE: Flat calibrations taken before Feb 6, 2005 were contaminated by emission lines from the lamp itself. Li and Na emission lines are obvious in these flat fields. They should be removed using fixpix or a similar task. I have found that the first flat is often not contaminated and can be used exclusively for these regions.

Creating flat fields with the HRS is difficult and confusing. The first thing you should realize is that these flats are internal to the HRS (taken through a separate calibration fiber), not taken through the science fibers. The flat will cover the same aperture as the science and sky fibers. There are several methods for creating a flat field; we will cover three:

apnormalize: This task requires the user to set apertures and trace the orders in the flat field. It can remove the blaze function but does not normalize in the cross-dispersion direction. Because the calibration fiber does not fill the slit in the same way as the science fibers, the flat will often have an inappropriate shape in the cross-dispersion direction compared to the science fibers. This has the effect of weighting your science data in an inappropriate way. We do not recommend using apnormalize.

flat1d: This task fits a spline to every line or column (column in our case). If you set the order of the spline to a very high value, say 91, you will create a reasonable flat. This task has the advantage of being able to flatten near the edge of the CCD where only partial orders are seen. The resulting flat will show some strange ringing at the edges of the flat aperture, but it is preferable to apnormalize. The task also has the disadvantage of smoothing out correlated pixel variations, such as the fringing at wavelengths above 670 nm.

apflatten: This task requires the user to set apertures and trace the orders in the flat field. It can remove the blaze function and it normalizes in the cross-dispersion direction. When the parameters are optimized this task can produce superb flats.

For all of the above it is important to pay attention to the flux levels and the S/N that will result. Because the pixel-to-pixel variation in the blue is small (a few tenths of a percent) and the throughput of the flat field lamp in the blue is low, it is quite possible to harm your data by flat fielding with a noisy flat. I suggest that the threshold parameter in all of the above tasks be set so that only data above 10,000 electrons in the summed flat are included. This threshold can be lowered in the red, where the fringing can be worse than a 1% effect.
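
A flat1d-style normalization with such a threshold might look like the sketch below (the file name, the column orientation, and the use of a Chebyshev polynomial in place of IRAF's spline are all assumptions):

    import numpy as np
    from astropy.io import fits
    from numpy.polynomial import chebyshev

    # A minimal sketch: fit a smooth, high-order function along each column of
    # the summed flat and divide it out, but only where the flat exceeds the
    # ~10,000 electron threshold (counts assumed already in electrons).
    flat = fits.getdata("summed_flat.fits").astype(float)
    norm = np.ones_like(flat)
    rows = np.arange(flat.shape[0])

    threshold = 10000.0
    for col in range(flat.shape[1]):
        good = flat[:, col] > threshold
        if good.sum() > 100:               # need enough points for a high-order fit
            coef = chebyshev.chebfit(rows[good], flat[good, col], deg=31)
            fit = chebyshev.chebval(rows, coef)
            norm[good, col] = flat[good, col] / fit[good]

    fits.writeto("flat_norm.fits", norm, overwrite=True)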

Finding Orders or Apertures

Because the HRS uses fibers, the apertures are quite flat-topped, and thus marking the orders with the apfind task (in the IRAF echelle package) is difficult. We have better success setting the nsum parameter to a much higher value, such as 30 to 100.
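
The effect of a large nsum can be illustrated outside IRAF (the file name, dispersion axis, and peak-finding parameters are assumptions): sum a few dozen columns near the middle of the chip and look for the flat-topped fiber profiles in the resulting cross-dispersion cut.

    import numpy as np
    from astropy.io import fits
    from scipy.signal import find_peaks

    # A minimal sketch: build a cross-dispersion profile by summing ~60 columns
    # near the chip centre (the analogue of a large nsum) and locate apertures.
    img = fits.getdata("flat_red.fits").astype(float)
    mid = img.shape[1] // 2
    profile = img[:, mid - 30:mid + 30].sum(axis=1)

    # Even flat-topped fiber profiles give one clear peak each once summed.
    peaks, _ = find_peaks(profile, height=0.2 * profile.max(), distance=10)
    print("found", len(peaks), "apertures at rows", peaks)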

Tracing Orders or Apertures

Tracing the sky fibers is difficult because of the (hopefully) low flux levels. If you don't have a sky fiber and have a high S/N spectrum (S/N > 50), then you probably won't have any problems tracing your orders, particularly if you do a bias subtraction and either subtract a dark frame or fixpix the remaining bad columns. If you do have a sky fiber, then you have two options for tracing the sky fibers:

  • trace your science fiber and offset the apertures (using the s keystroke in the apedit task). You should do this in the ALL mode (the a keystroke) and look for the sky lines. You can interactively change the row that apedit uses with the :line command.
  • use the sky flat or dome flat that should have been provided in the calibrations for your setup and then use those apertures as reference apertures.
Background Subtraction

In traditional slit spectroscopy the background region would be set so that the scattered light and sky light are removed during the extraction. Because the fibers can have different throughputs, there may be a scaling factor involved in removing the sky light. Thus, for best effect, no background should be subtracted during the extraction, and the sky subtraction should be done later. However, for a quick and dirty first-order extraction the background can be used to remove most of the sky light. Whether you should subtract sky at all is a separate question and is addressed in the sky subtraction section below. The background region can be set interactively for a single aperture in the apedit task with the b keystroke. The default parameters for the background can be set in apdefault. In addition to redefining the sample region, I suggest setting a few of the following parameters: b_naverage=-1, b_niterate=3, and b_high_reject=2.5. When you run the extraction, with either apall or apsum, you should set the background parameter to either fit or median.

Extraction

Extraction in IRAF is done with either the apall or apsum task. We recommend that sky subtraction be done separately so that the throughput differences between the fibers can be properly accounted for; the background parameter should therefore be set to none. There are a number of bells and whistles that can be set to optimize the extraction, such as cleaning and weighting. If you choose to use these, then the extras option should also be turned on. This will allow you to look at the unweighted, non-cleaned data in a separate band of the extracted spectra. You can help the cleaning of your data by setting the saturation parameter slightly higher than the strongest real feature (often your spectrum plus a bright sky line).

Sky Subtraction

The first question you should ask is whether you should be subtracting any sky at all. If you do not have a sky fiber, the question is moot. If you do have a sky fiber but do not see any sky, then you should not subtract the sky. If you do have a sky fiber but only see sky emission lines, then you may or may not want to subtract sky. You need to determine whether you are in the read-noise or the Poisson-noise limited regime. Even though the read noise of the MM1 CCD is low (~4 electrons), there are lots of pixels to sum over: unbinned, there are 20 pixels across the 3 arcsec fiber. That is a lot of read noise. Thus you should consider whether subtracting the sky to get rid of a few sky lines is going to add a lot of read noise to your spectrum. If you are working in the blue on faint objects the answer might well be yes! Sky subtract with caution.
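
For a rough feel of the numbers (read noise and pixel count from above; the sky level is an assumed illustration), compare the read noise picked up by summing across the fiber with the Poisson noise of a faint sky:

    import math

    # Back-of-the-envelope numbers: ~4 e- read noise and 20 unbinned pixels
    # across the 3 arcsec fiber (from the text); the sky level is assumed.
    read_noise = 4.0        # electrons per pixel
    npix = 20               # pixels summed across the fiber
    sky_per_pixel = 5.0     # electrons per pixel from the sky (assumed)

    read_term = math.sqrt(npix) * read_noise      # ~18 e- per wavelength bin
    sky_term = math.sqrt(npix * sky_per_pixel)    # ~10 e- of sky Poisson noise

    # Subtracting the extracted sky fiber adds roughly another read_term
    # in quadrature to the science spectrum.
    print("read noise per bin: %.1f e-, sky Poisson noise: %.1f e-"
          % (read_term, sky_term))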

If you do decide to subtract the sky fiber, you should extract the sky in the same way that you extract your spectrum. You should have the sky subtraction parameter in apsum (or apall) turned off, and the aperture should be the same width as your science fiber. In fact, I suggest that you use the science fiber as a template. Here is a potential set of steps:

  • apedit and aptrace the science aperture in the sky or dome flat
  • using the flat as a reference, apedit the science spectrum and check that the aperture looks good on the science fiber in all lines
  • change any of the science aperture widths interactively
  • extract the science spectrum
  • using the science spectrum as a reference, apedit and aptrace the flat again and offset to the sky fiber
  • copy the science spectrum to a new file
  • using the flat as a reference, apedit the renamed science spectrum and check that the aperture looks good on the sky fiber in all lines.
  • extract the sky spectrum.
  • You will probably have to scale the sky spectrum flux to that of the science fiber. This scaling is the reason why you do not want to use the sky subtraction to remove things that do not come from the sky, e.g. scattered light, bias, and dark current. To get a good idea of the relative throughput of the science and sky fibers, I suggest that you look at the flux in the dome or sky flat (NOT THE INTERNAL FLAT). In theory you should just have to multiply the sky fiber by a constant to get the subtraction correct, as sketched below.
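
A minimal sketch of that scaling and subtraction (file names assumed, 1-D extracted spectra assumed, and a single median used for the throughput ratio) is:

    import numpy as np
    from astropy.io import fits

    # A minimal sketch: scale the extracted sky spectrum by the fiber
    # throughput ratio measured from the dome/sky flat, then subtract it.
    sci = fits.getdata("object_sci.ec.fits").astype(float)    # science fiber
    sky = fits.getdata("object_sky.ec.fits").astype(float)    # sky fiber

    flat_sci = fits.getdata("domeflat_sci.ec.fits").astype(float)
    flat_sky = fits.getdata("domeflat_sky.ec.fits").astype(float)

    # Relative throughput of the two fibers from the dome or sky flat
    # (not the internal flat).
    scale = np.median(flat_sci) / np.median(flat_sky)

    fits.writeto("object_skysub.ec.fits", sci - scale * sky, overwrite=True)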

It is possible (although very tricky) to subtract the sky lines and not the read noise (assuming there is no continuum sky light in your sky fiber). To do this you will need to remove the same blaze function from the sky and the science fiber. Then you copy the sky spectrum to a second file in which you reject all values below a certain level and set them to zero. This can be done with imcombine. It is a very delicate procedure with limited uses, but it might improve the quality of your data. I suggest that before you try this, you try not subtracting the sky at all!
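
A crude sketch of the thresholding step (the 50-electron cut and the file names are assumptions, and both spectra must already have the same blaze function removed) is:

    import numpy as np
    from astropy.io import fits

    # A minimal sketch: keep only the bright sky emission lines and subtract
    # them, so that no extra read noise is added where there is no sky signal.
    sci = fits.getdata("object_sci_noblaze.fits").astype(float)
    sky = fits.getdata("object_sky_noblaze.fits").astype(float)

    lines_only = np.where(sky > 50.0, sky, 0.0)   # zero everything below the cut
    fits.writeto("object_linesub.fits", sci - lines_only, overwrite=True)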

Wavelength Calibrations

Getting the wavelength calibration is tedious but quite straightforward. I use the ecidentify task and end up with a solution with xorder=7 and yorder=5, although I start at a much lower order until I have identified a lot of lines.

See HRS wavelength calibration for ThAr identifications by spectral order.
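
For reference, the kind of two-dimensional dispersion solution ecidentify produces can be sketched as a Chebyshev fit in pixel and order; the line list below is synthetic and only stands in for your actual ThAr identifications:

    import numpy as np
    from numpy.polynomial import chebyshev

    # A minimal sketch: fit wavelength as a 2-D Chebyshev polynomial in
    # (pixel, order), analogous to xorder=7, yorder=5.  The arrays below are
    # placeholders; in practice they come from your identified ThAr lines.
    rng = np.random.default_rng(0)
    pix = rng.uniform(0, 2048, 300)                  # pixel centroids of lines
    order = rng.integers(40, 70, 300).astype(float)  # echelle order of each line
    wave = 500000.0 / order + 0.01 * (pix - 1024)    # fake wavelengths for the demo

    # Map both coordinates onto [-1, 1] and solve by least squares.
    x = 2 * pix / 2048 - 1
    y = (order - 40) / 15 - 1
    A = chebyshev.chebvander2d(x, y, [7, 5])
    coef, *_ = np.linalg.lstsq(A, wave, rcond=None)

    fitted = A @ coef
    print("rms residual:", np.sqrt(np.mean((fitted - wave) ** 2)))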

JCAM Data Reduction Tips



Last updated: Wed, 22 Feb 2012 23:26:27 -0600 shetrone


