Hey,
This is a great topic and one that comes up often in various forms. Here's my take on astrophotographic noise and how to reduce it. In keeping it simple, I'm sure I'll make a generalization or two that can be argued with... but with a larger goal in mind, here goes.
In low-light photography there are many sources of image noise (dark current, bias noise, photon/shot noise, thermal noise, ISO noise, stuck pixels, etc.). All are unwanted (hence "noise"), but they all boil down to two fundamental types: true random noise and pattern noise. There are many ways to label and describe these two, but a convenient pair of labels to me is non-correlated and correlated noise.
True random or non-correlated noise is what it says: a truly random, totally unpredictable sequence of values. True random values are non-correlated: there is no signal present, nothing that repeats with a pattern. No matter how long you look at it, nothing repeatable emerges. Taking advantage of that fact, the more frames you average, the more you reduce the random noise. To halve the noise, you need to average four times as many frames. (EDIT: I originally stated "squaring the averages halves the noise", which is wrong! Noise is reduced by the inverse square root of the number of averaged frames.) ISO noise is pretty much purely random in that it varies across the image from pixel to pixel and varies at each pixel from image to image.
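That inverse-square-root relationship is easy to see in a quick simulation. Here's a minimal NumPy sketch with made-up numbers (noise sigma of 10, 10,000-pixel "frames"): averaging four times as many frames cuts the residual noise roughly in half.

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 10.0  # invented noise level for the demo

def stack_noise(n_frames, n_pixels=10_000):
    """Average n_frames of pure Gaussian noise; return the residual std dev."""
    frames = rng.normal(0.0, sigma, size=(n_frames, n_pixels))
    return frames.mean(axis=0).std()

# Noise falls as 1/sqrt(N): 4x the frames -> roughly half the noise.
print(stack_noise(1))    # about 10
print(stack_noise(4))    # about 5
print(stack_noise(16))   # about 2.5
```

The pixel count and sigma are arbitrary; only the 1/sqrt(N) trend matters.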
Pattern or correlated noise may appear random to the casual observer, but if you look at it long enough a repeating or correlated value emerges. Many correlated noise sources have a true random component superimposed on a correlated or repeating signal. If there is no random component, the repeating portion of the signal appears immediately. A stuck (hot or dead) pixel is a very obvious signal with no random component: you see it in each and every image. No matter how many times you average, it just stays the same. It is correlated between frames, so you can't average it away.
An interesting example is Poisson noise, such as the number of letters you get in a week. Over a period of a year, if you count up all the mail you receive and got 572 pieces, it would average to 11 pieces of mail per week (572/52). During the year you might get 9 pieces one week and 13 the next, averaging 11 per week. Following weeks might be 35 pieces one week, 25 another, 0 three weeks in a row, and 6 the last... again averaging 11 pieces per week, even though you never actually got exactly 11 pieces in any one week. No matter how long you observe (well, as long as you watch for at least a few weeks), it will always tend to work its way toward an 11-piece average. This is an example of pattern noise that has a random component but a signal hiding below.
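The mailbox example can be simulated directly. A sketch, assuming weekly mail counts are Poisson-distributed with a true mean of 11 (the numbers below are just the ones from the story): individual weeks scatter widely, but the running average settles toward 11 the longer you watch.

```python
import numpy as np

rng = np.random.default_rng(7)
weeks = rng.poisson(lam=11, size=520)   # ten years of weekly mail counts

print(weeks[:5])            # individual weeks bounce around
print(weeks[:52].mean())    # one year: close to 11
print(weeks.mean())         # ten years: closer still
```

The random component never goes away week to week; only the average converges on the underlying signal.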
Let's look at a few typical noise sources you'll experience in an astro image:
A satellite streak: certainly obvious, predictable, and correlated from pixel to pixel within a single frame. But it doesn't appear in any subsequent frames, so it can be considered "correlated within a frame" but "non-correlated across frames." Averaging a bunch of images will eventually remove or minimize the satellite. However, if it was a REALLY bright Iridium flare, it's gonna take a while to average out. As Samir pointed out, there are better techniques for removing satellites and stuck pixels.
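One such technique is sigma-clipped stacking: since the streak is bright in exactly one frame, at each pixel you reject values far from the stack's median before averaging. A toy sketch with invented numbers (20 one-dimensional "frames" of sky around level 100, a fake streak added to frame 3):

```python
import numpy as np

rng = np.random.default_rng(0)
frames = rng.normal(100.0, 2.0, size=(20, 1000))   # 20 frames, sky ~100
frames[3, 400:600] += 500.0                        # streak in frame 3 only

def sigma_clip_mean(stack, k=3.0):
    """Per-pixel mean after rejecting values more than k sigma from the median."""
    med = np.median(stack, axis=0)
    dev = np.abs(stack - med)
    sigma = 1.4826 * np.median(dev, axis=0)        # MAD-based robust sigma
    clipped = np.where(dev <= k * sigma, stack, np.nan)
    return np.nanmean(clipped, axis=0)

plain = frames.mean(axis=0)
robust = sigma_clip_mean(frames)
print(plain[500])    # plain mean still carries the streak (about 125)
print(robust[500])   # clipped mean rejects it (about 100)
```

A plain average dilutes the streak by 1/N; the clipped average throws it out entirely.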
Thermal noise: one of the biggies in astrophotography. At first glance this can appear random across a single image, and even random at a single pixel across a few images. But it is similar to Poisson noise: there is an underlying signal caused by a thermal response that is unique and repeatable at each pixel. For a given exposure and temperature, the underlying thermal signal has the same fundamental value, but it varies slightly from image to image. So there is a random component between images, but a fundamental value that is always there. If you average enough images together, like the mailbox, the random component averages toward zero and the fundamental dark value remains. Averaging removes the random component but only reveals the correlated pattern noise beneath.
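You can see "averaging reveals the pattern" in a small simulation. A sketch with invented numbers: each pixel gets a fixed thermal value (uniform 0-20) plus fresh random noise (sigma 10) every frame. Stacking many frames kills the random part, but the fixed pattern survives at full strength.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels = 5000
pattern = rng.uniform(0.0, 20.0, size=n_pixels)    # fixed per-pixel thermal signature

def shoot(n_frames):
    """Average n_frames of (fixed pattern + fresh random noise each frame)."""
    noise = rng.normal(0.0, 10.0, size=(n_frames, n_pixels))
    return (pattern + noise).mean(axis=0)

one = shoot(1)
many = shoot(400)
print((one - pattern).std())    # large residual: random noise dominates
print((many - pattern).std())   # tiny residual: the stack has converged...
print(many.std())               # ...onto the pattern, which is still there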
Okay, now that we know the kinds of noise, we can attack how to remove them. A different technique is required for each.
Pure random noise is removed by averaging a bunch of images together. The more averages, the less noise. However, any signal that is present (desired ones like a star and undesired ones like thermal noise) remain and emerge clearer and clearer.
So averaging gets rid of random noise. How do we get rid of correlated or pattern noise? With dark frames or dithering.
Dark frames: when you shoot a series of dark frames at a single temperature and exposure length and average them, the random component averages toward zero but the thermal and pattern noise emerges as a signal. The same signal that is present in all your sky exposures. If you subtract an averaged dark frame from a light frame, the pattern component is removed from the image. Obviously a single dark frame does not do a lot of good, since it still contains the frame-to-frame random component. That's why we take 10, 25, 50, or however many dark frames. They have to be at the same temperature and exposure length as the lights, since the correlated dark signal changes as the temperature and exposure length change. And, to top it off, thermal noise is non-linear: a change in temperature affects different pixels by different amounts. Shucks. Gotta have a suite of dark frames.
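Here's a minimal sketch of that calibration, with made-up pixel values: a true sky level of 50, a fixed thermal pattern, and fresh random noise in every exposure. Averaging 50 darks into a master dark and subtracting it removes the pattern from the light.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pixels = 5000
sky = 50.0                                      # invented true sky signal
thermal = rng.uniform(0.0, 20.0, size=n_pixels) # fixed thermal pattern

def expose(signal, n_frames):
    """n exposures = signal + thermal pattern + fresh random noise each time."""
    return signal + thermal + rng.normal(0.0, 5.0, size=(n_frames, n_pixels))

master_dark = expose(0.0, 50).mean(axis=0)      # 50 darks averaged together
light = expose(sky, 1)[0]                       # one light frame
calibrated = light - master_dark

print(light.mean() - sky)        # biased high by the thermal pattern
print(calibrated.mean() - sky)   # close to zero: pattern subtracted out
print(light.std(), calibrated.std())
```

Note that subtraction only cancels the correlated part; the random noise in the light (and a little from the master dark) is still there, which is why you stack lights too.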
Dithering: This is a cool example. As explained by other posts in this thread, dithering is the act of shifting the telescope pointing a few pixels between exposures, so the image lands on a slightly different part of the sensor each time. As long as you dither by random amounts (3 pixels, then 5, then back 7, then 4, etc.), the pattern noise that was correlated between images is no longer correlated once the frames are re-aligned on the stars, because the sensor's fixed pattern gets shifted by a different amount in each frame. Since the dark noise is no longer correlated, averaging a series of re-aligned images removes that noise.
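A sketch of why that works, on a 1-D toy sensor with invented shift amounts: after re-aligning on the stars, the fixed pattern appears shifted by the dither amount in each frame, so averaging smears it out like random noise.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
pattern = rng.uniform(0.0, 20.0, size=n)   # fixed per-pixel thermal pattern

def stack(n_frames, dither):
    """Average the pattern's contribution to a re-aligned stack of frames."""
    acc = np.zeros(n)
    for _ in range(n_frames):
        # after alignment on the stars, the sensor-fixed pattern lands at a
        # random offset each frame (zero offset if we never dithered)
        shift = int(rng.integers(-8, 9)) if dither else 0
        acc += np.roll(pattern, shift)
    return acc / n_frames

fixed = stack(100, dither=False)      # pattern fully correlated: survives intact
dithered = stack(100, dither=True)    # pattern decorrelated: averages way down
print(fixed.std(), dithered.std())
```

With no dither, 100 averages leave the pattern untouched; with random shifts, each pixel averages many different pattern values and the residual drops sharply.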
Summary: Averaging removes random noise. Dark frame subtraction removes pattern noise. Dithering converts pattern noise into random noise so averaging begins to work again.
Back to the original post: Whew. After all that, if you do random dithering between frames, then average a series of re-aligned frames, both random and pattern noise are reduced. So you don't HAVE to subtract darks too. That being said, if you subtract dark frames from each light AND dither, you will reduce the pattern noise faster, so for any given number of averages you should end up with still less noise. There is benefit to doing both, although dithering alone really helps. However, that assumes you take the time to make good darks. If the sensor temperature is significantly different in your darks than when you shoot lights... you are actually introducing further pattern noise.
Interesting corollary: I shoot a lot with my CPC1100 on an alt-az mount. I generally can't shoot frames longer than about 30 seconds (rule of thumb) because of field rotation. Ahhh! Field rotation is similar to dithering! If you shoot 50 frames with field rotation between each one, align them, and average them, you effectively remove the pattern or thermal noise for the same reason dithering works. However, you statistically remove less noise near the middle of the image (the center of rotation, where the shifts are tiny) than around the periphery, where the apparent motion is greater.
So, I often get some nice results when shooting alt-az, even without using dark frame removal.
Double whew! Long winded. But I feel better now.
Best regards,
Jimmy the Geek, in Boulder