True detail vs. fake sharpness
13th December 2007, 12:41 pm
It’s the same on every web forum - if you post a digital picture which would be acceptable to a photo library or professional buyer, half a dozen grumpy one-liners will appear saying ‘That don’t look sharp to me’ or ‘There must be something wrong with your XXX’ (fill in D300, A700, E-3, D3, 40D as required). Then someone posts a hugely over-processed image and people say ‘Wow! What sharpness!’…
What is happening here? Why do some people know how to see fine detail rendering in an unprocessed image, while others do not see an image as sharp until they can count the jaggies on every hair and see a halo round every white-to-black edge transition?

There are several factors. One may simply be the type of screen used to view the image. We use Apple Cinema Displays throughout - two 24 inch, three 20 inch. They use the same technology as the LG-branded screens used by Alamy to check image submissions. They are very neutral, adding no sharpening of their own, and very crisp, with no anti-aliasing of images. The Mac system permits anti-aliasing of text and offers four levels of softening, which you can pick to suit various types of LCD and CRT screen. But some graphics cards also anti-alias images, and some screens - especially CRT monitors - are inherently very soft. When we used CRT monitors, we always specified Mitsubishi Diamondtron because they gave the most accurate pixel-for-pixel view of digital images.
Which is sharper to you?