
Ability of DSLR cameras to cut down UV and infrared light

decisivemoment
Contributor

Is it true that most digital cameras filter out the ultraviolet and infrared parts of the electromagnetic spectrum on their own, and therefore do not require separate filters for that purpose? If so, is even an entry-level DSLR, like my Canon EOS 1100D, capable of this?

1 REPLY

TCampbell
Elite

Yes, this is true.  In fact... more than you might guess.

 

In the days of film (I speak as if it's ancient history, but that's how many of us here grew up shooting), the sensitivity of the camera to various parts of the spectrum depended entirely on the film itself.  If you wanted to shoot infrared, you loaded the camera with infrared film.  Non-infrared film was not particularly sensitive to infrared light.

 

Many films were mildly sensitive to UV light, and the real point of the UV filter (which I think might surprise most people) is that whenever you pass light through a lens, the lens has a "dispersion" effect which bends different wavelengths of light by different amounts.  Compensating elements are used in the lens to attempt to correct for this, and better lenses are judged substantially by how well they handle it.  But it turns out human eyes are not sensitive to either UV or IR.  Since UV is on the short end of the spectrum, its light bends more than any other wavelength and would thus need the most correction.  If left uncorrected, this might cause an image to appear slightly "soft" because the UV wavelengths do not come to focus at precisely the same distance.  Rather than "correct" the UV wavelengths, it's much easier to just eliminate them entirely by using a filter -- problem solved.
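
To put a rough number on that dispersion, here's a quick sketch using the Cauchy approximation, with coefficients roughly matching a common crown glass (illustrative values, not a spec for any particular lens):

```python
# Cauchy approximation: n(wavelength) = A + B / wavelength^2
# Coefficients below roughly match BK7 crown glass (illustrative only).
A = 1.5046
B = 0.00420  # micrometers^2

def refractive_index(wavelength_um):
    """Approximate refractive index at a wavelength given in micrometers."""
    return A + B / wavelength_um ** 2

for label, wl in [("near-UV 380nm", 0.380), ("green 550nm", 0.550),
                  ("deep red 700nm", 0.700)]:
    print(f"{label}: n = {refractive_index(wl):.4f}")
# The shorter the wavelength, the higher the index, so UV bends the most
# and comes to focus at a slightly different distance than visible colors.
```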

 

Back to the era of digital...

 

Digital camera sensors are sensitive to both UV and IR, and much more so than film.  Effectively _every_ digital camera needs to filter these wavelengths out.  But more than this... human eyes are not actually "equally" sensitive to all colors of light.  We are far more sensitive to wavelengths in the middle of the visible spectrum and not nearly as sensitive to wavelengths near its edges.  You are, in effect, VASTLY more sensitive to "greens" and much less sensitive to either red or blue.  So while you see the world with reds, greens, and blues... the real world has significantly more red and blue than you are able to see.  A camera would see these more or less equally, and that would result in images which appear over-saturated with red and blue.
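
You can see that imbalance in the standard Rec. 709 luma weights used to convert RGB to grayscale, which approximate the eye's relative sensitivity to each primary:

```python
# Rec. 709 luma coefficients: each channel's contribution to perceived
# brightness. Green dominates because the eye is most sensitive to the
# middle of the visible spectrum.
WEIGHTS = {"red": 0.2126, "green": 0.7152, "blue": 0.0722}

def luma(r, g, b):
    """Perceived brightness of an RGB triple (channel values in 0..1)."""
    return WEIGHTS["red"] * r + WEIGHTS["green"] * g + WEIGHTS["blue"] * b

# A pure green patch looks far brighter than an equally intense
# pure red or pure blue patch:
print(luma(1, 0, 0), luma(0, 1, 0), luma(0, 0, 1))  # 0.2126 0.7152 0.0722
```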

 

To fix this, the camera does a few things...

 

First... the sensor's photosites sit under a mask of red, green, and blue micro "filters".  The array is usually referred to as a "Bayer mask" (there are other types, but this is the most common).  The mask has twice as many green-sensitive sites as it does red or blue sites.
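
Here's a minimal sketch of the common "RGGB" tiling (layouts vary by sensor, but the 2:1:1 green:red:blue ratio is the point):

```python
import numpy as np

def bayer_mask(rows, cols):
    """Build an RGGB Bayer pattern: each 2x2 tile holds one red site,
    two green sites, and one blue site."""
    mask = np.empty((rows, cols), dtype="<U1")
    mask[0::2, 0::2] = "R"
    mask[0::2, 1::2] = "G"
    mask[1::2, 0::2] = "G"
    mask[1::2, 1::2] = "B"
    return mask

m = bayer_mask(4, 4)
print(m)
# Green filter sites appear twice as often as red or blue:
for color in "RGB":
    print(color, (m == color).sum())  # R 4, G 8, B 4
```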

 

Second... the camera has both UV and IR filters inside, and these are located immediately in front of the sensor (but behind the shutter).  If you remove the lens, open the shutter, and look inside, what you are actually looking at is a couple of layers of filters (which appear mostly clear to you), with the sensor itself behind them all.

 

Third... the UV and IR filters have a gradual cut-off curve.  IR begins at 700nm (and longer); the visible spectrum runs from roughly 400nm through 700nm.  But it's not as if the filter passes 0% of the wavelengths shorter than 400nm, 0% of the wavelengths longer than 700nm, and yet 100% of everything between 400 and 700nm.  What it _really_ does is a gradual ramp.

 

The filter begins rolling off while the light is still in the 500nm range, and it is already noticeably filtering in the 600nm range.  By the time we get to the 656nm wavelength (which happens to be a fire-engine red color -- that's the "hydrogen alpha emission line"), the filter is already cutting nearly 75% of that red light, and this is still considered "visible" spectrum territory.  By the time it gets to the 700nm wavelength, it's cutting nearly all of it.
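
Here's a toy model of that ramp -- a logistic roll-off with the midpoint and steepness tuned so roughly 25% passes at the 656nm hydrogen alpha line, per the figure above (real filters have published, measured curves that differ in detail):

```python
import math

def ir_cut_transmission(wavelength_nm, midpoint=634.0, softness=20.0):
    """Toy logistic model of an IR-cut filter's transmission curve.
    Tuned to pass ~25% at the 656nm hydrogen alpha line; illustrative
    only, not a measured curve for any actual camera."""
    return 1.0 / (1.0 + math.exp((wavelength_nm - midpoint) / softness))

for wl in (550, 600, 656, 700, 750):
    print(f"{wl}nm: {ir_cut_transmission(wl):.0%} transmitted")
# 550nm: 99%, 600nm: 85%, 656nm: 25%, 700nm: 4%, 750nm: 0%
```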

 

This may sound surprising or undesirable, but it actually mirrors the sensitivity of your own eyes.  Your eyes are less sensitive to those wavelengths, so the camera is deliberately de-tuned to be less sensitive to them as well, and in doing this you end up with a photograph which more closely approximates the color proportions your eyes are used to seeing.

 

There are people who modify their cameras so they can be used for IR photography, and astrophotographers love to modify them for astrophotography purposes.  (The Canon 60Da is a special edition of the 60D which is pre-modified by Canon to be more sensitive to the wavelengths needed in astrophotography, which benefits greatly when sensitivity to the Ha (hydrogen alpha) wavelength at 656.28nm is improved.)
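
Following from the rough 25% figure above, a back-of-the-envelope comparison shows why that modification matters for Ha targets (both transmission numbers are assumptions, just for illustration):

```python
# Assumed transmission at the 656nm Ha line (illustrative, not specs):
stock = 0.25     # stock IR-cut filter, per the ~75% cut described above
modified = 0.95  # rough goal of an astro modification

# Exposure time needed to collect the same Ha signal scales inversely
# with transmission:
print(f"Modified camera gathers Ha about {modified / stock:.1f}x faster.")  # ~3.8x
```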

 

That's probably more than you wanted to know about the filters... but there you have it!

 

There is no "need" to use a UV (or IR) filter on any of your lenses.  Some people use them as a form of "protection" (to avoid getting dirt directly on the front element of the lens).  There is a downside to using them in that they often create optical reflections in your image... with "ghosting" or "flaring" that would not be visible had no filter been on the lens.

 

Regards,

Tim

 

Tim Campbell
5D III, 5D IV, 60Da