True Sharpness vs Fake Sharpness article


Status
Not open for further replies.

theveed

New Member
Apr 20, 2007
#1
Excerpt:

True detail vs. fake sharpness
13th December 2007, 12:41 pm

It’s the same on every web forum - if you post a digital picture which would be acceptable to a photo library or professional buyer, half a dozen grumpy one-liners will come out saying ‘That don’t look sharp to me’ or ‘there must be something wrong with your XXX’ (fill in D300, A700, E-3, D3, 40D as required). Then someone posts a hugely messed up image and people say ‘Wow! What sharpness!’…


What is happening here? Why do some people know how to see fine detail rendering in an unprocessed image, while others do not see something as sharp until you can count the jaggies on every hair and see a halo round every white to black edge transition?

There are several factors. One may simply be the type of screen used to view the image. We use Apple Mac Cinema screens throughout - two 24 inch, three 20 inch. They have the same technology as the LG branded screens used by Alamy to check image submissions. They are very neutral, adding no sharpening of their own, and very crisp with no anti-aliasing of images. The Mac system permits anti-aliasing of text and offers four levels of softening, which you can pick to suit various types of LCD and CRT screen. But some graphics cards also anti-alias images, and some screens - especially CRT monitors - are inherently very soft. When we used CRT monitors, we always specified Mitsubishi Diamondtron because they gave the most accurate pixel for pixel view of digital images.

Read the rest of the article...

Which is sharper to you?

[first image]

or

[second image]

theveed

New Member
Apr 20, 2007
#5
That's exactly what the article is pointing out: how the loss of resolution and detail is generally perceived as "sharp" by non-print users.
 

night86mare

Deregistered
Aug 25, 2006
#6
now this is weird, i like everything except for the eyeball area (white and blue) for the first

and i only like the eyeball area for the second - you can't deny that it looks sharper. edges are more defined for the details in the er, iris (right?)

mmmm, layer blending ftw :D
 

theRBK

Senior Member
May 16, 2005
#7
that's because the second image is not sharper... it has higher acutance... acutance is a measure of edge contrast rather than visibility of detail (sharpness)... the article's author seems not to differentiate the two...
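theRBK's distinction shows up numerically: unsharp masking raises edge contrast (acutance) by adding overshoot around transitions - the halos the article complains about - without recovering any actual detail. A minimal sketch in NumPy on a one-dimensional soft edge (a box blur stands in for the usual Gaussian; this is an illustration, not the article's procedure):

```python
import numpy as np

def unsharp_mask_1d(signal, radius=2, amount=1.0):
    """Boost edge contrast (acutance) by adding back the difference
    between the signal and a blurred copy of itself."""
    # Simple box blur as the low-pass filter (stand-in for Gaussian).
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.convolve(signal, kernel, mode="same")
    return signal + amount * (signal - blurred)

# A soft step edge, like a white-to-black transition along one image row.
edge = np.array([0, 0, 0, 0.25, 0.5, 0.75, 1, 1, 1], dtype=float)
sharpened = unsharp_mask_1d(edge, radius=2, amount=1.5)

# The result overshoots below 0 on the dark side and above 1 on the
# bright side: the "halo" that reads as sharp but adds no detail.
print(sharpened.min(), sharpened.max())
```

Note the input never leaves the 0..1 range, but the output does - that overshoot is pure acutance, which is why the second eye image "pops" without actually resolving more of the iris.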
 

dreamzcape

New Member
Aug 22, 2007
#8
refreshing article to read, thanks theveed.
 

Sep 15, 2003
chester.sg
#9
Great article to clarify some misconceptions.

But at the end of the day, viewers love sharp-looking images. And even though that means over-sharpened fakies, it's alright since it's only for web viewing. I do agree that one should not over-sharpen the hi-res image and ruin the details. Having said that, the article did mention how different printers apply sharpening, so it takes an experienced person to know when and how much sharpening to apply.
 

Viewpoint

New Member
Jan 5, 2007
#12
Quoting theRBK (#7): "that's because the second image is not sharper... it has higher acutance... acutance is a measure of edge contrast rather than visibility of detail (sharpness)... the article's author seems not to differentiate the two..."
To be fair, I think he did, although in passing. Somewhere in the middle of the article he writes: "Sharpening like this prints well but would be quite wrong for web display, ... the Radius setting determines how coarse the acutance line effect looks, and how contrasts are altered."

His point about not messing with reality is worth remembering. I agree that the iris looks "nicer" in the second overly sharpened shot but it is not "real." The iris has lower contrast compared to the eyelashes and therefore should look less sharp. In the author's own words: "The detail on the eyeball (which deserves retouching carefully on a final image) is precisely focused, and having lower contrast but higher frequency (fineness of detail) gives a more accurate indicator of a properly sharp digital image."

Seems like some selective sharpening procedure will give better results...
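The selective sharpening Viewpoint suggests can be sketched simply: apply the unsharp mask only where a mask marks high-contrast detail (the eyelashes) and leave low-contrast regions (the iris) untouched. A minimal NumPy illustration on a 1-D pixel row - the mask here is hand-drawn for the example, not part of any procedure from the article:

```python
import numpy as np

def selective_sharpen(row, mask, radius=1, amount=1.0):
    """Unsharp-mask a 1-D pixel row, but only where mask == 1."""
    # Box blur as a simple low-pass filter.
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.convolve(row, kernel, mode="same")
    sharpened = row + amount * (row - blurred)
    # Blend: sharpened pixels where the mask is set, original elsewhere.
    return np.where(mask == 1, sharpened, row)

# A hard edge we want crisped (mask = 1, eyelash-like) next to a
# region we leave alone (mask = 0, iris-like).
row = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
mask = np.array([1, 1, 1, 0, 0, 0])
out = selective_sharpen(row, mask)
```

In a real 2-D workflow the mask would come from a layer mask or an edge detector, but the blending step is the same idea.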
 

theRBK

Senior Member
May 16, 2005
#13
Quoting Viewpoint (#12): "To be fair, i think he did, altho in passing. Somewhere in the middle of the article, Quote: 'Sharpening like this prints well but would be quite wrong for web display, ... the Radius setting determines how coarse the acutance line effect looks, and how contrasts are altered.' Unquote."
unfortunately, by not making a clear distinction, the idea of sharpness (vs. acutance) still remains fudged up instead of getting clearer... :)

personally, I'm of the school of thought that what looks good is more important than what looks "real"... reality is a subjective thing in any case (as is what looks good, granted)... for me an image is an interpretation of the subject by the photographer/retoucher... which is why it's an art rather than a science, subjective rather than objective (unless one is taking images for scientific record)... and why the photographer/retoucher is so important in the process and not a mere operator of equipment :)
 
