What Is a Thermal Imaging Camera?
A thermal camera is a non-contact device that detects infrared energy (heat) and converts it into a visual image. Let’s dive into the science of thermal cameras and the invisible world of heat they allow us to see.
The created image represents the surface temperature of the object. The underlying technology of thermal imaging cameras was first developed for the military, but the invention of the thermal camera is rooted in the history of thermography, which began in 1800 when Sir William Herschel, an astronomer, discovered infrared light.
In 1878, American astronomer Samuel Pierpont Langley invented the bolometer, a device that measures infrared (thermal) radiation. And in 1929, the Hungarian physicist Kálmán Tihanyi invented an infrared-sensitive electronic television camera capable of capturing thermal images.
Both infrared radiation and visible light are part of the electromagnetic spectrum, but unlike visible light, infrared radiation cannot be perceived directly by the human eye. This is why a thermal camera is not affected by visible light and can produce a clear image of an object even in a dark environment.
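A quick calculation shows why warm objects show up in the infrared rather than in visible light. By Wien's displacement law, a body at temperature T radiates most strongly at a wavelength of roughly b / T, where b ≈ 2.898 × 10⁻³ m·K. A minimal sketch (the helper name is ours, not from any camera library):

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def peak_wavelength_um(temp_kelvin):
    """Peak blackbody emission wavelength in micrometres for a body at temp_kelvin."""
    return WIEN_B / temp_kelvin * 1e6

# A human body (~310 K) radiates most strongly near 9.3 micrometres,
# deep in the infrared and far beyond the ~0.7 micrometre edge of human vision.
print(round(peak_wavelength_um(310.0), 1))  # → 9.3
```

This is why everyday objects at room or body temperature glow brightly to a thermal camera while remaining completely invisible to the eye.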
Thermal imaging is all about converting that infrared light into electric signals and creating an image using that information.
This technology was revolutionary at the time, but it is in common use today. So how do these devices manage to capture this invisible visual information? Let’s check it out.
How Do Thermal Cameras Work?
Infrared energy has wavelengths starting at approximately 700 nanometers and extending to approximately 1 mm. Wavelengths shorter than this begin to be visible to the naked eye. Thermal imaging cameras use this infrared energy to create thermal images. The lens of the camera focuses the infrared energy onto an array of detectors, which creates a detailed temperature pattern called a thermogram. The thermogram is then converted into electrical signals and processed into a thermal image that we can see and interpret.
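The final step, turning a thermogram into a picture, is essentially a normalization problem: each temperature reading in the grid is mapped onto a pixel intensity. A minimal sketch of that idea, assuming a plain 2D list of readings (the function name and the tiny example scene are illustrative, not from any real camera firmware):

```python
def thermogram_to_grayscale(thermogram):
    """Map a 2D grid of temperature readings onto 0-255 pixel intensities:
    the coldest reading becomes 0 (black), the hottest 255 (white)."""
    flat = [t for row in thermogram for t in row]
    t_min, t_max = min(flat), max(flat)
    span = (t_max - t_min) or 1.0  # avoid division by zero on a uniform scene
    return [[round(255 * (t - t_min) / span) for t in row] for row in thermogram]

# A tiny 2x3 "scene": a warm hand (~34 C) against a cooler wall (~20 C).
scene = [[20.0, 20.0, 34.0],
         [20.0, 27.0, 34.0]]
image = thermogram_to_grayscale(scene)  # the hand appears as bright pixels
```

Real cameras apply a false-color palette instead of grayscale, but the principle is the same: relative temperature differences, not absolute brightness, are what the final image encodes.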