Common photography myths (5)

Myth #4: Infrared films record thermal radiation

September 5th, 2009 - 08:33:30 PM:

Simply put, they don't. You can't take a picture of your house with IR film to see where the insulation is leaking. If that worked, the warmth of the camera body, the film canister and the film itself would expose the film. Thermal imaging uses completely different devices, mostly electronic video cameras whose imaging sensor is cooled to very low temperatures so that its own heat doesn't register as signal.

Infrared film is not much different from normal film. It is simply also sensitive to light at wavelengths longer than visible red, i.e. into the near-infrared range. Thermal radiation has wavelengths much longer than anything IR film is capable of recording.

Typically, IR films are sensitive to wavelengths between 400 nm and 800 nm–900 nm, while regular film is sensitive between about 400 nm and 660 nm. An object only emits appreciable light at 900 nm when it is almost glowing hot!
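To get a feel for the numbers, Wien's displacement law relates a blackbody's temperature to the wavelength at which it radiates most strongly. The sketch below is an idealization (real objects emit some light on the short-wavelength tail well below these temperatures), but it shows how far thermal radiation is from IR film's reach:

```python
# Wien's displacement law: lambda_peak = b / T, with b ~ 2.898e-3 m*K.
WIEN_B = 2.898e-3  # Wien displacement constant, in metre-kelvins

def peak_wavelength_nm(temp_kelvin):
    """Wavelength (nm) at which an ideal blackbody at temp_kelvin emits most strongly."""
    return WIEN_B / temp_kelvin * 1e9

def temp_for_peak_nm(wavelength_nm):
    """Blackbody temperature (K) whose emission peaks at the given wavelength."""
    return WIEN_B / (wavelength_nm * 1e-9)

# A room-temperature wall (~293 K) radiates mostly around 10 000 nm (10 um),
# more than ten times longer than IR film can record:
print(round(peak_wavelength_nm(293)))   # -> 9891

# For the emission peak to reach 900 nm, the edge of IR film sensitivity,
# an object would have to be incandescent:
print(round(temp_for_peak_nm(900)))     # -> 3220
```

So a leaky wall radiates around 10 µm, while the film gives up at about 0.9 µm, which is why the insulation experiment can't work.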

With IR films you still need a light source, e.g. the sun, and you record the light that is reflected from objects in the scene. Also, if you want only IR light to expose IR film, you have to use a filter that blocks almost all other wavelengths. Otherwise the results from monochrome IR films don't look much different from those of normal monochrome films.
