The future of Cameras and Lens Mounts

Tronhard
VIP

This is a continuing debate that I see in the various fora to which I belong.  Just to add a bit of spice to the whole thing, Tony Northrup has produced his take on what the future holds.  Now, I realize that he has some controversial views on elements of photography, but apparently he was a director of marketing for a Fortune 100 company and accurately predicted both the challenges around 2000 and the demise of his own company.  Business analysis is his thing...

 

Anyway, for what it's worth, here is a link to his presentation: Are DSLRs Dead?

 

Doubtless this will invoke some reactions from our own members. 😉


cheers, TREVOR

The mark of a good photographer is less what they hold in their hand, it's more what they hold in their head;
"All the variety, all the charm, all the beauty of life is made up of light and shadow", Leo Tolstoy;
"Skill in photography is acquired by practice and not by purchase" Percy W. Harris


@altco wrote:

 

I don’t fully understand how this works but, for a certain sensor size, the larger number of smaller pixels seems to offset the decrease in signal quality from each smaller photosite.

 


Stop watching Northrup’s videos.  Forget every technical explanation he has ever made of how things work.  The following is all you really need to know about image sensors in cameras and light.  Are you familiar with a muffin tin?

 

[Image: a muffin tin]

 

In a way, they are similar to an image sensor.  Nearly all of those tins are the same size, but the size of their cups varies from small to medium to large to extra large.  An image sensor is likewise a flat surface covered with little photosites, and for this analogy think of each photosite as a cup that collects water.  Now, hold that thought.

 

Light falling on an image sensor is a lot like rainfall.  If exposed to rainfall, the cups begin to fill with water.  The rate of the rainfall is a constant, though.  If rain is falling at one quarter inch per hour, each of those cups will have a quarter inch of water inside it after an hour.

 

It does not matter how large or how small the muffin cup is; it will collect only one quarter inch of water after an hour.  The larger cups will collect a greater volume of water, but the depth of water in every cup will be the same one quarter of an inch.  Now imagine if you had a garden hose with a hand-valve sprinkler head on it, to turn water on and off.
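The depth-versus-volume point can be checked with a few lines of arithmetic (a toy sketch; the cup diameters here are made-up numbers, not anything from the analogy):

```python
import math

RAIN_RATE_IN_PER_HR = 0.25   # one quarter inch of rain per hour
HOURS = 1.0

# Hypothetical cup diameters in inches, small through extra large
cup_diameters = {"small": 1.5, "medium": 2.5, "large": 3.5, "xl": 4.5}

depth = RAIN_RATE_IN_PER_HR * HOURS   # identical for every cup
for name, d in cup_diameters.items():
    area = math.pi * (d / 2) ** 2     # mouth area in square inches
    volume = area * depth             # cubic inches of water collected
    print(f"{name:>6}: depth={depth:.2f} in, volume={volume:.2f} in^3")
```

Every cup reports the same quarter-inch depth; only the collected volume grows with the cup's mouth area.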

 

[Image: a garden hose sprinkler head]

 

Your camera shutter controls how long light can fall onto the image sensor.  That sprinkler valve head that turns water on and off is just like your camera shutter.  One allows light to pass, while the other allows water to pass.

 

Notice that the sprinkler heads have a ring that allows you to control how much water passes through the head.  This is very much like how your aperture works.  Your aperture controls how much light is capable of passing through the lens to the image sensor, just like the sprinkler head is capable of controlling how much water passes through the head.

 

How much water passes through the sprinkler head depends upon the water pressure.  How much light passes through the lens to reach the image sensor depends upon the intensity of the light.  A light meter measures the intensity of light.

 

A light meter can take that intensity measurement and calculate a shutter speed and aperture setting.  Imagine you are standing over the muffin tin with the sprinkler head in your hand.  You want to fill the cups, but neither overfill nor underfill them.


Back to the muffin cups.  Water from the sprinkler will fall on the muffin tin at a steady rate filling each cup.  Here’s the crucial part.  The larger cups will collect a greater volume of water because they are wider, but the depth of water in each cup will be the same as time passes.  The higher the water pressure, the faster the cups fill, and the shorter length of time the valve needs to be turned on.
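That pressure/time trade-off is exposure reciprocity in miniature: for a fixed target depth, the required valve time is inversely proportional to the fill rate. A toy sketch (the rates are invented numbers):

```python
TARGET_DEPTH_IN = 0.25   # inches of water wanted in every cup

def valve_time_hr(fill_rate_in_per_hr: float) -> float:
    """Hours the valve must stay open to reach the target depth."""
    return TARGET_DEPTH_IN / fill_rate_in_per_hr

# Doubling the pressure (rate) halves the time the valve stays open,
# just as doubling the light intensity halves the required shutter time.
for rate in (0.25, 0.5, 1.0):
    print(f"rate {rate:.2f} in/hr -> valve open {valve_time_hr(rate):.2f} hr")
```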

 

This is exactly how the individual photosites on an image sensor collect light.  Light is allowed to fall on the sensor for a fixed length of time.  The aperture is used to control the RATE of the flow, but it is the intensity of the light that determines the overall volume that gets collected.

 

When a photosite has collected its light, you empty it out, just like emptying a glass of water.  But a small fraction of the water always gets left behind in the glass.  The volume of wasted water is a constant, no matter the size of the glass.  Suppose the first ounce of water in the glass is noise.  Losing 1 ounce from a small 8 ounce glass is more significant than losing 1 ounce from a 24 ounce glass.

 

Photosites are the same way with light.  A small fraction of the collected light gets lost as noise.  The larger volume of light collected by large photosites translates directly into more dynamic range and contrast in the final image.
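One way to put numbers on the "constant loss" idea: treat the loss as a fixed read-noise floor per photosite and compare it with the photosite's full-well capacity. All the electron counts below are invented for illustration, not specs for any real sensor:

```python
import math

READ_NOISE_E = 4.0   # assumed noise floor in electrons, same for both sites

def dynamic_range_stops(full_well_e: float) -> float:
    """Dynamic range in stops: log2(full-well capacity / noise floor)."""
    return math.log2(full_well_e / READ_NOISE_E)

small_site_e = 20_000   # hypothetical small photosite full well
large_site_e = 80_000   # hypothetical large photosite full well (4x the area)

print(f"small photosite: {dynamic_range_stops(small_site_e):.1f} stops")
print(f"large photosite: {dynamic_range_stops(large_site_e):.1f} stops")
```

Four times the well depth against the same noise floor buys exactly two extra stops of dynamic range.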

 

Conclusion: everything that I just said is true no matter the size of the muffin tin, or the image sensor.

--------------------------------------------------------
"Fooling computers since 1972."

Stop watching Northrup’s videos.

 

Maybe, at some point. For now I need to set my gauge by studying lots of technically competent pictures, along with brief commentaries from someone who knows what makes an image good or bad.

 

Conclusion: everything that I just said is true no matter the size of the muffin tin, or the image sensor.

 

This analogy is indeed capable of giving an intuitive sense of the inner workings of a light sensor, and you did a good job of eliminating one of its most misleading elements (some people claim that the noise is just the chance of having, say, 10 drops of water in one cup vs. 8 drops in another; a photosite collects thousands of photons, so this variance plays only a minor role).

What escapes my understanding is the difference between simply averaging some adjacent cups/photosites vs. true hardware binning. If Northrup’s claims are wrong, we are just one step away from true equivalence between optical and digital zoom.
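For what it's worth, that difference can be sketched in a toy Monte Carlo. The usual account is that hardware (charge) binning sums the electrons on-chip and pays the read noise once per binned group, while software averaging reads each pixel separately and pays it N times. Every number below is invented for illustration:

```python
import math
import random
import statistics

random.seed(0)
N = 4            # pixels combined, e.g. a 2x2 block
SIGNAL_E = 5.0   # mean photoelectrons per pixel (a dim scene)
READ_E = 3.0     # read-noise sigma in electrons, paid per readout
TRIALS = 20000

def poisson(lam):
    """Tiny Poisson sampler (Knuth's method), fine for small lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

binned, combined = [], []
for _ in range(TRIALS):
    charges = [poisson(SIGNAL_E) for _ in range(N)]
    # hardware binning: charge summed on-chip, read out ONCE
    binned.append(sum(charges) + random.gauss(0.0, READ_E))
    # software combining: each pixel read out separately (N read-noise hits)
    combined.append(sum(c + random.gauss(0.0, READ_E) for c in charges))

def snr(xs):
    return statistics.mean(xs) / statistics.stdev(xs)

print(f"hardware-binned SNR:    {snr(binned):.2f}")    # noise ~ sqrt(N*s + r^2)
print(f"software-combined SNR:  {snr(combined):.2f}")  # noise ~ sqrt(N*s + N*r^2)
```

Under these assumptions the two only diverge when read noise matters, i.e. in dim light; with plenty of signal, shot noise dominates both and they converge.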


@altco wrote:

Stop watching Northrup’s videos.

 

Maybe, at some point. For now I need to set my gauge by studying lots of technically competent pictures, along with brief commentaries from someone who knows what makes an image good or bad.

 

Conclusion: everything that I just said is true no matter the size of the muffin tin, or the image sensor.

 

This analogy is indeed capable of giving an intuitive sense of the inner workings of a light sensor, and you did a good job of eliminating one of its most misleading elements (some people claim that the noise is just the chance of having, say, 10 drops of water in one cup vs. 8 drops in another; a photosite collects thousands of photons, so this variance plays only a minor role).

What escapes my understanding is the difference between simply averaging some adjacent cups/photosites vs. true hardware binning. If Northrup’s claims are wrong, we are just one step away from true equivalence between optical and digital zoom.


Actually, the best way to learn is by getting out and taking photographs.  The inner workings and details of what goes on under the hood are good to know, but you seem to have reached the point where it has become an impediment.  In my opinion, the science of how the image sensor works is proprietary knowledge, and anything I see and hear on the internet is mostly speculation and conjecture.

 

Translation:  it is a #dontcarecondition.  It is outside of my immediate scope.  It is not something that I really need to know in order to use the tool effectively.  

 

I drive a car every day.  I have no idea of the details of how the transmission works, or how it is able to switch from 2WD to 4WD automatically when the tires begin to slip in the snow.  As long as it works well, that is really all that I need to know and concern myself with.  Designing transmissions is something entirely different from simply driving a car.

--------------------------------------------------------
"Fooling computers since 1972."

"Stop watching Northrup’s videos."

 

I don't know if stopping watching his vids is the answer (probably a good idea), but he is totally wrong on this and some other things. It seems to have confused you completely.  The muffin tin and rain is a fantastic, albeit simple, way to demonstrate how it works. A moment of brilliance from Waddizzle, actually.

 

I think Tony Northrup found it easier to make money doing the vids and books than actually working as a photographer. Don't watch his vid on aperture! It's way off.

EB
EOS 1DX and 1D Mk IV and fewer lenses than before!

Thanks for adding some highlights to my otherwise underexposed response to Waddizzle; I certainly don't like it when the posts on a photography forum end up as specimens from the low-dynamic-range black-and-white era.


@altco wrote:

Right, the term ‘amplification’ is a bit sloppy in this context; the electronics used to measure the electrical charge generated by a pixel are not your classical amplifier. The electrical charge generated by the photosensitive area needs to be transformed into something that can be measured by an ADC, and tuning the level of this ‘something’ to the level expected by the ADC is not usually referred to as amplification.

However, the energy of the signal coming from a pixel is still a function of the area of the pixel; for a given illumination, a small .1'' pixel generates 10^4 times less electrical charge than a 10'' pixel, since collected charge scales with area. Whatever the electronics employed within a sensor, a smaller pixel needs more light if it is to deliver an equal amount of electrical charge.
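For the record, the factor follows from area alone: with a 100x difference in linear size the charge ratio comes out to 10^4 rather than 10^6, because collected charge scales with the square of the linear size:

```python
small_px, large_px = 0.1, 10.0           # linear pixel sizes from the example
area_ratio = (large_px / small_px) ** 2  # collected charge scales with area
print(area_ratio)
```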


Amplification has nothing to do with it.  Northrup’s claims on this topic are textbook bad science.  What matters is the size of the photosite, which is not determined by the size of the sensor.  While it is true that a full frame sensor can have larger photosites than an APS-C sensor, there are instances where the reverse is true.

 

The 18 MP Rebel T6 has 4.3 micrometer pixels on an APS-C sensor.  The EOS 5DS has 4.1 micrometer pixels on a full frame sensor.  Curiously, both camera bodies have similar ISO ranges.
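Those pitch figures can be reproduced from the nominal sensor dimensions and pixel counts (assuming square pixels, APS-C taken as 22.3 x 14.9 mm, full frame as 36 x 24 mm, and the 5DS as 50.6 MP):

```python
import math

def pixel_pitch_um(width_mm: float, height_mm: float, megapixels: float) -> float:
    """Approximate pixel pitch in micrometers, assuming square pixels."""
    area_mm2 = width_mm * height_mm
    pitch_mm = math.sqrt(area_mm2 / (megapixels * 1e6))
    return pitch_mm * 1000.0

print(f"Rebel T6 (APS-C, 18 MP): {pixel_pitch_um(22.3, 14.9, 18.0):.1f} um")
print(f"EOS 5DS (FF, 50.6 MP):   {pixel_pitch_um(36.0, 24.0, 50.6):.1f} um")
```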

Also, Northrup’s claims completely ignore sensor technology, which can make a dramatic difference in ISO range.  Drawing conclusions from sensor size alone leads to conclusions that have little to no basis in fact, because he ignores the most crucial facts.

--------------------------------------------------------
"Fooling computers since 1972."

"Northrup’s claims completely ignore sensor technology, which can make a dramatic difference in ISO range."

 

Taken to the extreme: would you rather have a 1D Mk II with its huge photosites or a 1DX with much smaller ones? Of course this is silly, because no one would take a Mk II over a newer 1DX, but it does demonstrate that newer tech is always (usually!) better.

 

Northrup blew this one. (BTW, he does so on some others too!) I had to quit watching or reading him.

EB
EOS 1DX and 1D Mk IV and fewer lenses than before!


@altco wrote:

>>> asking if he had ever seen a light meter with a crop factor compensation

 

As I understand it, Northrup claims that the physical measure of interest is the amplification of the signal from the sensor, while ISO is just an indirect indication of this measure.

 

A light meter gives an indication in terms of ISO sensitivity and, because the ISO sensitivity is in fact just calibrated amplification for the particular sensor/crop, the light meter does not need further compensation.


There is a very simple reason why Northrup’s claims are complete and utter nonsense.  A DSLR uses a separate sensor for metering, not the actual image sensor.  Light meters do not have a crop factor; they never have and never will, for this exact reason.

--------------------------------------------------------
"Fooling computers since 1972."

ebiggs1
Legend
Totally agree
EB
EOS 1DX and 1D Mk IV and fewer lenses than before!