10-07-2017 10:32 AM
Look at DXO Mark. While their "overall rating" is subjective and opaque, their ratings on low light appear to be based on some objective criterion.
Those numbers are totally meaningless to me. There is nothing on their web site to explain what they are. Not even the article titled "DxOMark camera sensor testing protocol and scores" offers a hint at what the numbers mean.
All I see are a number in a box, and a number in a circle. Pretty.
10-08-2017 10:19 AM
"... at least presumably based on some consistent threshold of image quality..."
Scotty, the problem is that this is a lab. Its testing is done on consistent testing equipment. Whether it is a lens or a sensor, the real world is not consistent. If you want to shoot DxO test charts, their numbers may be valid, but the world around you is not a test chart. And the last time I even looked at DxO, they did not reveal how they arrive at the numbers. Just the results.
Another boondoggle, IMHO, is MTF charts. Nice for shooting MTF charts but tells you little else. Some people love this kinda stuff and spend hours comparing it. That's how DXO stays in business I guess.
10-08-2017 10:58 AM
Yeah, I don't like DxO, and I qualified my statement by saying their methodology is never made clear and that their famous overall composite score is unreliable because no one knows how they arrive at the number. I do think, though, that there is likely some utility in their camera-to-camera comparisons on the single "sports" rating, because they say it rates only one thing: the highest ISO at which the sensor can deliver some benchmark of image quality that is the same for every camera tested.
10-08-2017 12:38 PM - edited 10-09-2017 05:20 AM
Check out their article, "DxOMark camera sensor testing protocol and scores". It seems to explain one number, but not the number in the circle.
"Portrait" - Seems to be a rating of color sensitivity and depth, as measured in data bits. "22 bits is excellent."
"Landscape" - Seems to be a rating of dynamic range, as measured in EV. But no floor is given.
"Sports" - Seems to be a signal-to-noise rating, which describes the highest ISO value they consider acceptable with the camera.
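Reading the "Sports" description that way, the score reduces to a simple threshold lookup: scan measured SNR at each ISO and report the highest ISO that still clears the bar. A minimal sketch of that idea; the 30 dB threshold and all the SNR numbers below are invented for illustration, not DxO's actual criteria or data:

```python
def low_light_iso(snr_by_iso, threshold_db=30.0):
    """Return the highest ISO whose SNR meets the threshold, or None."""
    passing = [iso for iso, snr in snr_by_iso.items() if snr >= threshold_db]
    return max(passing) if passing else None

# Invented measurements: SNR in dB falls as ISO (i.e. gain) rises.
measured = {100: 42.1, 400: 38.5, 1600: 32.0, 3200: 29.2, 6400: 25.8}

print(low_light_iso(measured))  # 1600
```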
The numbers in the circles are the real mystery, though. As is the overall score, which is some sort of weighted average.
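For what it's worth, the "dynamic range in EV" mentioned above does have a standard engineering definition: log2 of the ratio between the sensor's saturation level and its noise floor. A quick sketch with made-up sensor figures (these electron counts are invented, not measurements of any real camera):

```python
import math

# Hypothetical sensor figures, in electrons (invented for illustration).
full_well_e = 60000.0   # saturation capacity ("full well")
read_noise_e = 4.0      # noise floor (read noise)

# Engineering dynamic range in EV (stops) = log2(saturation / noise floor).
dr_ev = math.log2(full_well_e / read_noise_e)
print(round(dr_ev, 1))  # 13.9
```

What DxO's "Landscape" number lacks, as noted above, is any statement of the floor they use.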
10-08-2017 08:34 PM
"I do think though there is likely some utility in their comparison ..."
I have to agree, to a degree. But I would never rely on them for a decision to buy or not to buy. Then again, I don't believe very much of what I read; seeing is believing. That's one reason I do what I do, or at least that's my excuse.
10-09-2017 11:14 AM
DxO is very vague about "how" they perform tests and what they share are "scores" -- typically they do not share actual data (although I have heard of a few exceptions).
A couple of years back, one of the photo magazines ran an article by a writer who was invited into DxO's labs and got to see how they perform their tests. He wasn't allowed to write about everything (they controlled what he could disclose), but he did mention a few eye-openers, and at one point he asked them about a practice he felt was unethical.
The particular case (what he felt was unethical) was how they performed their ISO tests. If I recall the story correctly, they capture the images at each ISO so they can analyze the data. But instead of analyzing the data as-is... they "resample" the image down to some common resolution, rather than running the analysis on the native, untouched file straight out of the camera.
It turns out that smaller photo-sites tend to be noisier than large photo-sites. This puts high-res cameras at a bit of a disadvantage. DxO apparently (according to this author) resamples the images from higher-res cameras down to the same resolution as lower-res cameras. But if you do this, you automatically reduce noise. They were then posting the "scores" from the resampled images, not showing the actual images, and not disclosing that they had resampled them.
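The "resampling automatically reduces noise" claim is easy to verify for yourself. A minimal sketch with made-up numbers (a flat gray frame plus Gaussian noise, nothing to do with DxO's actual pipeline): averaging N independent pixels divides the noise standard deviation by sqrt(N), so a 2x2 downsample cuts measured noise roughly in half.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented example: a flat 128-gray "image" with Gaussian noise of std 10.
native = np.full((1000, 1000), 128.0) + rng.normal(0.0, 10.0, (1000, 1000))

# "Resample" to half resolution by averaging each 2x2 block of pixels.
half = native.reshape(500, 2, 500, 2).mean(axis=(1, 3))

# Averaging 4 independent samples divides the noise std-dev by sqrt(4) = 2.
print(round(native.std(), 1))  # ~10.0
print(round(half.std(), 1))    # ~5.0
```

So comparing a downsampled high-res file against a native low-res file flatters the high-res camera unless that normalization is disclosed.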
The author knew enough to recognize that this was a bit of a cheat. Beyond the question of whether resampling is proper or fair in the first place, he felt there was a second ethical issue: they don't disclose that they do it.
I first recognized something was funny when I saw DxO posting better scores for some cameras... then reading other reviews (typically Pop Photo and DPReview) and noticing them giving a completely different review. (e.g. DxO "scores" would show camera X has less noise than camera Y... and Pop Photo or DP Review would show camera Y had less noise than camera X.) But Pop Photo and DP Review always show their sample images (at full res) so you can zoom in and see for yourself.
I had someone going on (in another forum) about such a comparison from DxO... so I pulled up the sample images from DP Review and showed him that the actual data - real images you can see with your own eyes - showed the opposite.
This is when I started to wonder about DxO's testing methods... and it was a bit later that I ran into the article that peeled back a bit of the veil of secrecy and mentioned some of the things DxO does.
I've since seen this happen time and again.
I've also seen DxO give their "dynamic range" scores. PopPhoto did dynamic range testing too, except PopPhoto did it at each ISO, so you could see where camera 'a' did slightly better than camera 'b' at low ISO, and then, as you changed ISOs, the two cameras would trade places and suddenly camera 'b' had more dynamic range. DxO doesn't show us that data, just the score; I don't think you can derive this from DxO's reports.
I've seen enough going on with DxO that I no longer trust them.
10-09-2017 11:56 AM
Lenses...this was going to be my next question.
Want to finish reading the remaining responses to this thread...and my other...before asking some questions.
Would like to thank all of those who have responded.