Friday 7 November 2014

Forensics: Introduction to Gaussian Analysis

Resolving and 'Dissolving' Data
While tuning a radio, static (signal noise) gives way to a barely coherent signal, then to progressively less static, until a clear signal is resolved.  Continuing to tune past this point produces the opposite pattern: an increasingly poor signal-to-noise ratio, until once again all that remains is static.  Plotted against tuning position, the signal quality traces a typical Gaussian, or Normal, distribution curve.
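
To make that shape concrete, here is a minimal sketch in Python (the curve and its width are illustrative assumptions, not measurements of any real tuner) modelling signal quality as a Gaussian function of how far off-station the dial sits:

```python
import math

def signal_quality(tuning_offset_khz, sigma_khz=25.0):
    """Illustrative model: signal quality peaks at the station's frequency
    (offset 0) and falls away symmetrically on either side following a
    Gaussian (bell-shaped) curve. sigma_khz controls how quickly the
    signal dissolves back into static."""
    return math.exp(-(tuning_offset_khz ** 2) / (2 * sigma_khz ** 2))

# Quality is highest on-station and tails off either side of it.
for offset in (-75, -50, -25, 0, 25, 50, 75):
    print(f"{offset:+4d} kHz off-station -> quality {signal_quality(offset):.2f}")
```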

A digital image is also created from electromagnetic radiation, and the various parameters that go into making up a digital image likewise, for the most part, follow Gaussian distributions.  Of the five image quality parameters used in the Image Quality Tool, only Resolution works rather differently from the rest.  In theory there is no limit to the maximum resolution of a digital image; in practice, beyond a certain point lens quality caps image sharpness, so image quality reaches a ceiling rather than falling away again.
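
To see how that differs from the bell-shaped behaviour of the other parameters, here is a rough sketch (the constants are invented for illustration, not derived from real lens data) modelling sharpness as a curve that climbs and then flattens at a lens-imposed ceiling rather than peaking and falling away:

```python
import math

def sharpness_from_resolution(megapixels, lens_limit=0.9, k=0.35):
    """Illustrative saturating curve: perceived sharpness climbs with
    sensor resolution but levels off at a ceiling (lens_limit) set by
    lens quality. Extra resolution is never harmful here, it simply
    brings diminishing returns."""
    return lens_limit * (1 - math.exp(-k * megapixels))

for mp in (2, 6, 12, 24, 48):
    print(f"{mp:2d} MP -> sharpness {sharpness_from_resolution(mp):.2f}")
```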

The other parameters (focus, exposure, white balance and the occurrence of digital artefacts) all largely follow a Gaussian distribution, as illustrated below.

[Figure: Gaussian distribution curves for focus, exposure, white balance and digital artefacts, each peaking at the optimum image.]

Not only do these quality parameters share a similar distribution around a 'normal' or optimum image, many of them also transform images in very similar ways.  Reduced resolution, defocus, and both under- and over-exposure all have the effect of 'dissolving' images in one way or another.  The result is a gradual loss of image definition, acutance (edge contrast) and tonal range, until ultimately detail and colours become obscured.
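
The loss of acutance is easy to demonstrate on a synthetic step edge.  The sketch below treats defocus as a Gaussian blur and uses the steepest tonal gradient across the edge as a crude stand-in for acutance; both choices are simplifying assumptions rather than a full optical model:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# A one-dimensional step edge: dark tone on the left, light on the right.
edge = np.concatenate([np.full(50, 0.2), np.full(50, 0.8)])

def acutance(signal):
    """Crude acutance proxy: the steepest tonal gradient across the edge."""
    return np.abs(np.diff(signal)).max()

print(f"sharp edge    : acutance {acutance(edge):.3f}")
for sigma in (1, 2, 4, 8):
    blurred = gaussian_filter1d(edge, sigma)  # defocus modelled as Gaussian blur
    print(f"blur sigma={sigma:<2d} : acutance {acutance(blurred):.3f}")
```

As the blur widens, the edge contrast collapses, and with it any fine detail that depended on that contrast.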

I believe it may be possible to study the Gaussian pattern of each of these image quality parameters and find characteristic signatures that we can then use to estimate the level of image data loss.  For example, in a photograph where bright sunlight pushes the scene beyond the sensor's dynamic range, the brightness of the pixels in an area of the image might tell us the likely level of detail loss in that area.  Or, based on the level of defocus in an image, we might be able to establish whether a feather fringe is likely to have been obscured or not.
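
As a first pass at the over-exposure case, the sketch below estimates likely highlight detail loss from the fraction of near-saturated pixels in a region.  The clipping threshold and the use of that fraction as a loss indicator are assumptions of mine, not calibrated values:

```python
import numpy as np

def estimated_highlight_loss(region, threshold=250):
    """Fraction of pixels in an 8-bit image region at or above a clipping
    threshold. A high fraction suggests the scene has pushed past the
    sensor's dynamic range there, so fine detail (a feather fringe, say)
    is likely to have been lost in that area."""
    region = np.asarray(region)
    return float((region >= threshold).mean())

# Hypothetical patches: bright sky next to a well-exposed area.
bright_patch = np.random.default_rng(0).integers(235, 256, size=(64, 64))
normal_patch = np.random.default_rng(1).integers(90, 180, size=(64, 64))
print(f"bright patch: {estimated_highlight_loss(bright_patch):.0%} of pixels clipped")
print(f"normal patch: {estimated_highlight_loss(normal_patch):.0%} of pixels clipped")
```

A usable version would need calibrating against real photographs, but it illustrates the kind of inference the conditions of capture might support.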

Obviously, if data is lost, or never existed in the first place, we can never retrieve it or be 100% certain it existed.  But perhaps we can still make useful inferences based on the conditions under which the image was produced.  For now I am just introducing the concept.
