Image Processing

The galaxy UGC1382 - from the initial 3-colour image to an enhanced image
showing that what appeared to be an elliptical galaxy is actually a spiral.
Credit: NASA/JPL/Caltech/SDSS/NRAO/L. Hagen and M. Seibert

When we use telescopes to take images of objects in space, we are actually collecting data. This data comes back in its raw form, and in order to interpret what is going on in the observations we need to process it and analyse the images.

Our everyday digital cameras do this processing for us automatically, but with telescope data we have to do it ourselves. There are several stages involved in processing a raw image before we can analyse it; some of these are described below:

  • Debiasing - astronomical detectors have a small electronic signal, called a bias, added to every exposure so that the instrument always records a baseline level. This bias needs to be subtracted from the image before we can measure the signal from the object we are looking at (see the first code sketch after this list).
  • Flat Fielding - when we take an image of an object in the sky, we find that the background brightness is not uniform across the image, and this variation can differ depending on which colour or band we are observing in. We therefore map the variation, usually with images of the twilight sky taken in each filter, and remove it from our 'science' image (this step is included in the first sketch after the list).
  • Calibration - when we take observations we need to measure the brightness of an object, or in the case of spectroscopy we need to know the wavelengths we are looking at. To do this we take extra observations of objects whose brightness, or wavelength coverage, is already known. This could be a standard star, one whose brightness is well measured and does not vary, or an arc lamp containing a particular element whose emission lines have known wavelengths. We can then calibrate our observations so that we can get our own measurements (see the second sketch after this list).
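
The first two steps, debiasing and flat fielding, are often carried out together. Below is a minimal sketch of how this might look in Python using NumPy and Astropy; the file names (bias_*.fits, flat_*.fits, science.fits) are placeholders, and a real pipeline would also deal with dark current, overscan regions and bad pixels.

    # Sketch of debiasing and flat fielding (file names are illustrative).
    import glob
    import numpy as np
    from astropy.io import fits

    def median_stack(paths):
        """Median-combine a list of FITS images into one master frame."""
        return np.median([fits.getdata(p).astype(float) for p in paths], axis=0)

    # Master bias from several zero-exposure frames taken with the shutter closed.
    master_bias = median_stack(glob.glob("bias_*.fits"))

    # Master flat from twilight-sky exposures in the same filter:
    # subtract the bias, combine, then normalise so the flat has a mean of 1.
    flats = [fits.getdata(p).astype(float) - master_bias
             for p in glob.glob("flat_*.fits")]
    master_flat = np.median(flats, axis=0)
    master_flat /= np.mean(master_flat)

    # Reduce the science frame: remove the bias level, then divide by the flat
    # to correct the variation in brightness across the image.
    raw = fits.getdata("science.fits").astype(float)
    reduced = (raw - master_bias) / master_flat

    fits.writeto("science_reduced.fits", reduced, overwrite=True)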
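For the calibration step, one common approach for imaging is to measure a standard star and work out a photometric zero point, which is then applied to the target. The sketch below illustrates the idea; the counts, exposure times and catalogue magnitude are made-up numbers for illustration only.

    # Sketch of photometric calibration against a standard star
    # (all numerical values are illustrative, not real measurements).
    import numpy as np

    def instrumental_mag(counts, exptime):
        """Instrumental magnitude from background-subtracted counts and exposure time."""
        return -2.5 * np.log10(counts / exptime)

    # Standard star: known catalogue magnitude and the counts we measured for it.
    std_catalogue_mag = 12.30
    std_counts, std_exptime = 8.5e5, 30.0

    # Zero point: the offset between instrumental and true magnitudes.
    zero_point = std_catalogue_mag - instrumental_mag(std_counts, std_exptime)

    # Apply the same zero point to a target observed in the same filter.
    target_counts, target_exptime = 3.2e4, 300.0
    target_mag = instrumental_mag(target_counts, target_exptime) + zero_point
    print(f"Calibrated magnitude of target: {target_mag:.2f}")

The same idea carries over to spectroscopy, where an arc-lamp exposure with known emission-line wavelengths is used to convert detector position into wavelength.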