
Available techniques of industrial radiography image forming

In industrial radiography, the usual procedure for producing a radiograph is to have a source of penetrating (ionizing) radiation (X-rays or gamma-rays) on one side of the object to be examined and a detector of the radiation (the film) on the other side, as shown in figure 1-1. The energy level of the radiation must be well chosen so that sufficient radiation is transmitted through the object onto the detector.

The detector is usually a sheet of photographic film, held in a light-tight envelope or cassette having a very thin front surface that allows the X-rays to pass through easily. Chemicals are needed to develop the image on film, which is why this process is called the classic or “wet” process.

Nowadays, different kinds of radiation-sensitive films and detectors that do not require chemicals to produce images, the so-called “dry” processes, are used increasingly. These techniques make use of computers, hence the expressions digital or computer-aided radiography (CR) and direct digital radiography (DR).

A DR-related technique that has been available for many decades is the one in which images are formed directly with the aid of (once computerless) radiation detectors in combination with monitor screens (visual display units: VDUs). This is in fact an early version of DR. In these through-transmission scanning techniques (known as fluoroscopy), the storage of images and image enhancement are continually improved by the gradual implementation of computer technology. Nowadays, there is no longer a clear division between conventional fluoroscopy with the aid of computers and the entirely computer-aided DR. In time, DR will, to some extent, replace conventional fluoroscopy.



Basic set-up for film radiography

Summarising, the image of radiation intensities transmitted through the component can be recorded on:

The conventional X-ray film with chemical development, the “wet” process, or one of the following “dry” processes:

  • A film with memory phosphors and a work station for digital radiography, called computer-assisted radiography or CR.
  • Flat bed detectors and a computer work station for direct radiography, called DR.
  • A phosphorescent or fluorescent screen (or similar radiation-sensitive medium) and a closed-circuit television (CCTV) camera as in conventional fluoroscopy, an early version of direct radiography.
  • Radiation detectors, e.g. crystals, photodiodes or semiconductors, arranged in a linear array, with which an image of a moving object is built up from a series of measurements. This method is applied in systems for luggage checks at airports.
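The line-scan principle in the last item can be illustrated with a short sketch (purely illustrative; the detector readings below are invented numbers, not real measurements): a linear detector array records one row of intensities per measurement as the object moves past, and the stacked rows form a two-dimensional image.

```python
# Illustrative sketch of line-scan image formation: one 1-D detector
# reading per object position, stacked into a 2-D image.
rows = []
for step in range(4):                              # object positions passing the array
    reading = [step * 10 + px for px in range(3)]  # fake readings from a 3-element array
    rows.append(reading)                           # one scan line per measurement

image = rows                                       # stacked lines form the image
print(len(image), len(image[0]))                   # prints: 4 3  (a 4 x 3 pixel image)
```

In a real luggage scanner the same idea runs continuously: the belt moves the object past the array and each detector read-out appends one more line to the image.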

The source of radiation should be physically small (a few millimetres in diameter), and as X-rays travel in straight lines from the source through the specimen to the film, a sharp “image” is formed of the specimen and discontinuities. This geometric image formation is identical to the shadow image with a visible light source. The sharpness of the image depends, in the same way, on the radiation source diameter and its distance away from the surface on which the image is formed.
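The shadow-image geometry described above is commonly quantified by the standard similar-triangles result Ug = f·b/a, where f is the source (focal spot) diameter, a the source-to-object distance and b the object-to-film distance. This formula is not stated in the text above but follows directly from the straight-line propagation it describes; the function and example values below are illustrative.

```python
def geometric_unsharpness(focal_spot_mm, source_to_object_mm, object_to_film_mm):
    """Penumbra (geometric unsharpness) of the shadow image, in mm.

    Similar triangles for a source of finite size give Ug = f * b / a:
    a larger source, or a film further behind the object, blurs the edge.
    """
    return focal_spot_mm * object_to_film_mm / source_to_object_mm

# A 3 mm source, 700 mm from the object, with the film 50 mm behind it:
print(geometric_unsharpness(3.0, 700.0, 50.0))  # about 0.21 mm
```

The formula confirms the statement in the text: sharpness improves with a smaller source diameter and a larger source-to-object distance.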

The “classic” film in its light-tight cassette (plastic or paper) is usually placed close behind the specimen and the X-rays are switched on for an appropriate time (the exposure time), after which the film is taken away and processed photographically, i.e. developed, fixed, washed and dried. In direct radiography (DR), a coherent image is formed directly by means of a computerised developing station. The two methods have a negative image in common: areas of less material (less absorption) allow more X-rays to be transmitted to the film or detector and therefore show increased density. Although there is a difference in how the images are formed, the interpretation of the images can be done in exactly the same way. As a result, the DR technique is readily accepted.

The “classic” film can be viewed after photochemical treatment (wet process) on a film viewing screen. Defects or irregularities in the object cause variations in film density (brightness or transparency). The parts of the film which have received more radiation during exposure – the regions under cavities, for example – appear darker, that is, the film density is higher. Digital radiography gives the same shades of black and white images, but viewing and interpretation is done on a computer screen (VDU).
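Film density as used here has a standard photographic definition: the base-10 logarithm of incident viewing light over transmitted light, D = log10(I0/I). The small function below sketches this definition (the numeric example is illustrative).

```python
import math

def optical_density(incident_light, transmitted_light):
    """Photographic (optical) density: D = log10(I0 / I).

    A density of 2 means only 1% of the viewing light passes the film;
    darker (more exposed) areas have higher density.
    """
    return math.log10(incident_light / transmitted_light)

print(optical_density(100.0, 1.0))  # a film passing 1% of the light -> 2.0
```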

The quality of the image on the film can be assessed by three factors, namely:

  1. Contrast
  2. Sharpness
  3. Graininess

As an example, consider a specimen having a series of grooves of different depths machined in the surface. The density difference between the image of a groove and the background density on the radiograph is called the image contrast. A certain minimum image contrast is required for the groove to become discernible.
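The groove example can be put into a trivial sketch: contrast is the density difference between the groove image and the background, and the groove only becomes discernible above some minimum contrast. The threshold value used below is purely illustrative, not a figure from the text.

```python
def image_contrast(groove_density, background_density):
    """Image contrast: the density difference between the image of the
    groove and the background density on the radiograph."""
    return groove_density - background_density

def is_discernible(contrast, minimum_contrast=0.02):
    """A detail is visible only above some minimum contrast; the default
    threshold of 0.02 is an assumed, illustrative value."""
    return contrast >= minimum_contrast

print(is_discernible(image_contrast(2.35, 2.30)))  # deep groove  -> True
print(is_discernible(image_contrast(2.31, 2.30)))  # shallow one  -> False
```

This mirrors both statements in the text: raising the contrast makes a given groove easier to see, and brings the images of shallower grooves above the visibility threshold.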

With increased contrast:

a. the image of a groove becomes more easily visible

b. the image of shallower grooves will gradually also become discernible

Assuming the grooves have sharply machined edges, the images of the grooves could still be either sharp or blurred; this is the second factor: image blurring, called image unsharpness.

At the limits of image detection it can be shown that contrast and unsharpness are interrelated and detectability depends on both factors.

As an image on a photographic film is made up of grains of silver, it has a grainy appearance, dependent on the size and distribution of these silver particles. This granular appearance of the image, called film graininess, can also mask fine details in the image.

Similarly, in all other image forming systems these three factors are fundamental parameters. In electronic image formation, e.g. digital radiography or scanning systems with CCTV and screens, the factors contrast, sharpness and noise are a measure of the image quality; pixel size and noise being the (electronic) equivalents of graininess.

The three factors: contrast, sharpness and graininess or noise are the fundamental parameters that determine the radiographic image quality. Much of the technique in making a satisfactory radiograph is related to them and they have an effect on the detectability of defects in a specimen.

The ability of a radiograph to show detail in the image is called “radiographic sensitivity”. If very small defects can be shown, the radiographic image is said to have a high (good) sensitivity. Usually this sensitivity is measured with artificial “defects” such as wires or drilled holes.
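For wire-type image quality indicators, sensitivity is conventionally expressed as the diameter of the thinnest discernible wire as a percentage of the penetrated material thickness. The text does not give this formula explicitly; the function below sketches that common convention with illustrative values.

```python
def wire_iqi_sensitivity(smallest_visible_wire_mm, specimen_thickness_mm):
    """Radiographic sensitivity (%) from a wire-type IQI:
    diameter of the thinnest visible wire as a percentage of the
    penetrated thickness (common convention; a LOWER percentage
    means a BETTER, i.e. more sensitive, radiograph)."""
    return 100.0 * smallest_visible_wire_mm / specimen_thickness_mm

# A 0.2 mm wire just visible on a 10 mm thick specimen:
print(wire_iqi_sensitivity(0.2, 10.0))  # 2.0 (percent)
```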