Future sensors


I've used the Canon 400D... then the 40D... then the 7D....

My own opinion is that at ISO 1600 the 40D is better than the 400D, and the 7D is better than the 40D... and I agree that while the TS is asking about the sensor, let's not forget that the DIGIC image processor is part of the system through which we get to see the image.
 

diver-hloc said:
I've used the Canon 400D... then the 40D... then the 7D....

My own opinion is that at ISO 1600 the 40D is better than the 400D, and the 7D is better than the 40D... and I agree that while the TS is asking about the sensor, let's not forget that the DIGIC image processor is part of the system through which we get to see the image.

Yes, the DSP in each camera plays an important part in delivering good-quality images too. While good lenses and sensitive sensors are paramount to good image quality, having a fast, feature-rich processor is no less important.

If the processor is fast and good, more computing time can be spent on higher-quality noise reduction and on more complex image-enhancement algorithms. And since JPEG and lossy RAW both rely heavily on compression, a cleaner input image means fewer details lost, because less space is wasted storing noise.
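A quick way to see why noise wastes compressed space is to losslessly compress a clean gradient versus the same gradient with noise added. This is a toy sketch of my own (zlib on synthetic bytes), not anything a camera actually runs:

```python
# Toy illustration: noise makes data less compressible, so a noisy image
# spends bytes on noise that could otherwise hold real detail.
import random
import zlib

random.seed(42)

# A smooth gradient, like a clean patch of sky in an image.
clean = bytes(min(255, i // 4) for i in range(1024))

# The same gradient with a few levels of random noise added per sample.
noisy = bytes(max(0, min(255, b + random.randint(-8, 8))) for b in clean)

clean_size = len(zlib.compress(clean))
noisy_size = len(zlib.compress(noisy))

print(f"clean row compresses to {clean_size} bytes")
print(f"noisy row compresses to {noisy_size} bytes")
assert noisy_size > clean_size  # noise always costs extra space
```

The same effect applies to JPEG's entropy-coding stage: the cleaner the input, the more of the file budget goes to actual detail.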

Not to mention that a better processor can also mean lower energy consumption, and hence a lighter body and longer battery life. These are important factors for a lot of users too.

And with more in-camera processing capability, it all adds up to a leaner workflow for professionals.
 

If we are looking at losing noise, why limit the factor to just the sensor?
I'm sure improvements to the processor will yield better results...

But how about a completely new camera?
A 3CCD/CMOS FF DSLR. Is it possible?

Current sensors sense R, G and B on three different pixels and join the values (using a chromatic filter placed in front of the sensor and such...).
So the camera needs to guess the values a pixel is not collecting (a red pixel needs to guess its G and B values) based on surrounding values.
The higher the sensitivity, the bigger the effect of a bad guess. And that's noise in a high-ISO image.
(This is what I understand of sensor engineering; do correct me if I'm wrong.)

So if someone used three CCDs/CMOSes (borrowed from 3CCD video cams), one for each colour (RGB), then there would be very little guessing involved, reducing noise by leaps and bounds.
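For readers unfamiliar with the "guessing" described above, here is a minimal sketch of bilinear demosaicing on a toy RGGB Bayer mosaic. The function name and layout are my own illustration, not any manufacturer's algorithm:

```python
# Minimal bilinear demosaic step: a red photosite never measures green,
# so its green value is interpolated from the green neighbours around it.
def demosaic_green_at_red(mosaic, y, x):
    """Estimate green at a red photosite by averaging the four
    neighbours (up/down/left/right) that actually measured green."""
    neighbours = [mosaic[y - 1][x], mosaic[y + 1][x],
                  mosaic[y][x - 1], mosaic[y][x + 1]]
    return sum(neighbours) / len(neighbours)

# Tiny RGGB mosaic: even rows are R G R G..., odd rows are G B G B...
mosaic = [
    [200, 100, 200, 100],
    [100,  50, 100,  50],
    [200, 100, 200, 100],
    [100,  50, 100,  50],
]

# The red photosite at (2, 2) never measured green; interpolate it.
g = demosaic_green_at_red(mosaic, 2, 2)
print(g)  # → 100.0, the average of the four surrounding green samples
```

If one of those neighbouring samples is corrupted by noise, the error is spread into the interpolated pixel as well, which is the "bad guess" effect the post describes.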
 

IsenGrim said:
If we are looking at losing noise, why limit the factor to just the sensor?
I'm sure improvements to the processor will yield better results...

But how about a completely new camera?
A 3CCD/CMOS FF DSLR. Is it possible?

Current sensors sense R, G and B on three different pixels and join the values (using a chromatic filter placed in front of the sensor and such...).
So the camera needs to guess the values a pixel is not collecting (a red pixel needs to guess its G and B values) based on surrounding values.
The higher the sensitivity, the bigger the effect of a bad guess. And that's noise in a high-ISO image.
(This is what I understand of sensor engineering; do correct me if I'm wrong.)

So if someone used three CCDs/CMOSes (borrowed from 3CCD video cams), one for each colour (RGB), then there would be very little guessing involved, reducing noise by leaps and bounds.

Noise reduction is a trade-off, because any real-world signal contains noise. If the technology allows for better receptors with higher gain and good SNR, that is always the better way to go. Noise will be present in any signal, even a digital one; digital just has discrete steps and error correction to resolve these issues, and a better processor helps alleviate them further.

We are already using FF CMOS today, aren't we? It seems CMOS is chosen over CCD nowadays. I am not too sure about 3CCD, except that it was used in video cameras previously.

It seems that for a long time technology did not move towards stacking sensor layers vertically, though that could be worse quality-wise than keeping them separate. I'm not in this field, so I can't comment much. Generally, raising sensitivity is like turning up an amplifier: the more you amplify the signal, the higher the gain, but the noise is amplified too, so it is not really a win-win situation. Better technologies nowadays can give better SNR, and I think that is the key factor when you talk about how good a sensor is. Of course there are other factors besides SNR, such as how accurately and uniformly the colors of the real scene are reproduced.
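The amplifier point can be shown numerically: multiplying a noisy signal by a gain factor scales the mean and the noise together, so the SNR is unchanged. A toy sketch with synthetic numbers (the signal and noise levels are arbitrary assumptions, not a real sensor model):

```python
# Toy demonstration that analog gain ("higher ISO") does not improve SNR:
# it scales the signal and the noise by the same factor.
import math
import random

random.seed(0)

true_signal = 100.0        # hypothetical light signal, arbitrary units
noise_sigma = 5.0          # hypothetical sensor noise level

samples = [true_signal + random.gauss(0, noise_sigma) for _ in range(10000)]

def snr(values):
    """Mean divided by standard deviation of the samples."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return mean / math.sqrt(var)

gained = [4.0 * v for v in samples]  # "ISO boost": multiply everything by 4

print(round(snr(samples), 2))
print(round(snr(gained), 2))  # same value: the noise was amplified too
assert abs(snr(samples) - snr(gained)) < 1e-6
```

This is why a genuinely better sensor (more signal captured, or less noise introduced) matters more than simply turning the gain up.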

Interpolation from the RGB array to full-color pixels is not the real source of the noise. Each color receptor suffers the same amount of noise being introduced, so there is really not much difference with 3CCD or other technologies; it wouldn't give you better noise control. Besides, regardless of which method you use, there are color filters in front to give you that RGB separation. And if you look at 3CCD, you can easily end up with a much heavier and larger camera because of the need for a prism and extra sensors.

You might want to read this:
http://hdtv.videotechnology.com/HDTV-CMOSvsCCD.htm
 

Honestly speaking, I don't see much of a problem with the noise level of modern cameras. Personally, I consider anything at ISO 400 or below "noiseless". Anything higher (I've gone up to ISO 3200 so far) can be compensated for fairly easily in post-processing using programs like Camera Raw (assuming you shoot RAW). Of course you lose fine detail, but I see that as normal. I don't see many situations requiring anything higher than ISO 3200 (with an f/2.8 or larger aperture and an FF body). Even when such a situation arises, I can compensate with a longer exposure time and a tripod. So honestly, I don't see the problem here.

On topic: I don't think you'll see significant advancement in noise-reduction technology until some breakthrough happens, and nobody can predict that. It may be a better choice to just make do with what you have.