Image Differencing (aka Ratioing)

Image differencing (also referred to as ratioing) is a commonly used technique in scientific image processing. It is typically applied to identify and enhance details very close to the "noise floor" of an image. (The noise floor, by the way, is the level of contrast at which a real detail becomes indistinguishable from a detector's inherent noise.) Many observations are inherently low contrast, because an observed feature may contribute only a tiny fraction of the overall light entering a camera. Some examples are dust devils observed by Curiosity (only about 1% brighter than the background), the surface and lower-atmosphere clouds of Titan (a 2-20% difference from the background), or identifications of water on the Martian surface (often less than 0.5% of the light entering the camera).

Image differencing uses two photos of the same object taken either at different times or at different wavelengths. One of these images, the reference image, provides a good representation of the overall scene but does not contain the feature of interest. The other image, the source image, may contain the low-contrast object we are looking for. The goal of comparing these two images is to determine which pixels get lighter or darker, highlighting the feature of interest. A change in brightness is assigned a number equal to the absolute difference in brightness between the two pixels. So, if a pixel does not change brightness, it receives a value of 0 (black). If it changes by 10 brightness levels, it receives a value of 10 (dark gray). Most noise is on the level of 2-3 brightness values, so if you filter out everything below that level, you're left with the signal of interest. You can then enhance this signal by contrast stretching before adding it back to the original image. This increases a feature's contrast without also enhancing the baseline noise of the image.
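To make that arithmetic concrete, here is a minimal sketch in Python with NumPy. The function name, the 8-bit grayscale assumption, and the specific noise_floor and stretch values are illustrative choices of mine, not settings from any particular tool:

```python
import numpy as np

def difference_signal(source, reference, noise_floor=3, stretch=10):
    """Absolute difference of two aligned 8-bit grayscale frames:
    unchanged pixels come out 0 (black), small changes dark gray."""
    diff = np.abs(source.astype(np.int16) - reference.astype(np.int16))
    diff[diff <= noise_floor] = 0                 # drop the ~2-3 level noise floor
    stretched = np.clip(diff * stretch, 0, 255)   # contrast-stretch the survivors
    return stretched.astype(np.uint8)
```

Note the cast to a signed type before subtracting: unsigned 8-bit arithmetic would wrap around instead of going negative.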

A slightly more specific form of differencing is called subtraction. In this process, all pixels from the source image that are equal to or darker in brightness than the reference image are set to zero. This is useful in situations where you are sure the entire signal you are looking for will be brighter than the reference image, and it can help avoid artifacts introduced by unwanted signals. An example of this appears in the tutorial for processing Mars Express cloud images, where brightness differences related to clouds (always brighter in the source) are wanted signal, and those due to surface markings (always much darker in the reference) are unwanted signal.
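Here is the subtraction variant under the same assumptions as the sketch above; the signed difference is simply clamped at zero, so anything equal to or darker than the reference disappears:

```python
import numpy as np

def subtraction_signal(source, reference):
    """Keep only pixels where the source is brighter than the reference."""
    diff = source.astype(np.int16) - reference.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)  # equal-or-darker pixels -> 0
```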

A generalized workflow for image differencing:

1. ID a good reference image. For cases where an atmosphere is involved (cloud details, surface features, etc.), the reference image should be one taken through a filter where the atmosphere is as bland as possible. This provides a good model of the lighting conditions. For finding changes on a surface, you simply need images of the same terrain taken at two (or more) separate times.

2. Align the images and apply differencing. Most photo editing suites with layer-based overlays can perform this task: look for a drop-down menu called something like "Blending Mode", with "Difference" and "Subtraction" as options. (A code sketch of steps 2-4 follows this list.)

3. Process the differenced image. Many details in a differenced image will be very dark, especially when changes are small. Filter out noise (typically by using Levels to set obviously noisy pixels to black), then apply contrast enhancement and noise reduction to the remaining data.

4. Merge the differenced image back into the original. There are different ways to do this, but I typically use blending modes such as "Add" or "Overlay". After this step, you should have a picture where a barely visible, low-contrast detail is now an obvious one.
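As promised in step 2, here is one possible end-to-end sketch of steps 2-4 in Python with NumPy and Pillow. It assumes the two frames are already registered (step 1 and alignment happen beforehand, in your editor or a registration tool); the file paths, parameter values, and the saturating "Add"-style merge are illustrative stand-ins for whatever your software provides:

```python
import numpy as np
from PIL import Image

def enhance_feature(source_path, reference_path, noise_floor=3, stretch=8):
    # Load both frames as 8-bit grayscale (alignment assumed done already).
    src = np.asarray(Image.open(source_path).convert("L"), dtype=np.int16)
    ref = np.asarray(Image.open(reference_path).convert("L"), dtype=np.int16)

    # Step 2: difference the aligned frames.
    diff = np.abs(src - ref)

    # Step 3: zero out noise-level pixels, then contrast-stretch the rest.
    diff[diff <= noise_floor] = 0
    diff = np.clip(diff * stretch, 0, 255)

    # Step 4: saturating "Add" blend of the enhanced difference onto the source.
    merged = np.clip(src + diff, 0, 255).astype(np.uint8)
    return Image.fromarray(merged)

# Hypothetical file names, just to show usage.
enhance_feature("source.png", "reference.png").save("enhanced.png")
```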