Several pictures to improve image resolution


Status
Not open for further replies.

skanskan

New Member
Feb 6, 2007
Hello.

Does anybody know (*) how to improve image resolution using several pictures?
I mean, the same way HDR combines several pictures (taken with different exposures) to get a final one with more luminosity information (more bits), why not use the pixel information from several shots to extrapolate new information and build a higher-resolution picture?

The only reason I can see for not gaining additional data is if all the pictures are exactly the same, pixel for pixel.

(*) What software or method to use.


Regards
 

Are you talking about making panoramas? I believe there are several photo-stitching programs out there. Most of the time you would need to shoot with a tripod though.
 

Like, use 4 6MPixel images of the same scene to extrapolate to an uber 24MPixel image? So that I can resize down to 640x320 to post on Clubsnap? hmm... :)
 


you can use Photomatix or just File>Automate>Merge to HDR in PS CS2
 

Hello

raptor84 said:
Are you talking about making panoramas? I believe there are several photo-stitching programs out there. Most of the time you would need to shoot with a tripod though.

No, no, I don't mean stitching photos side by side to build a wider panoramic image, but using several pictures of the same area and exploiting the small differences between them to extrapolate new data.
I know it sounds a little strange.



machiavellian said:
you can use Photomatix or just File>Automate>Merge to HDR in PS CS2
I guess you think I just want to make an HDR. Or can these programs also be used to get what I'm looking for?
 

I have heard of using several images to produce an image with less noise, but I haven't heard of using several to produce higher resolution. The limit on resolution is probably the lens and the imaging chip, and averaging multiple images is unlikely to give more resolution. If the lens and chip can't "see" more resolution, more resolution can't be created.
 

Ain't that just normal interpolation, which can be done with Photoshop?
 

Hello
I've been trying different programs and methods and I'm not sure about the results.
The problem is that my PC runs out of memory, so I have to work with small crops of the original picture.
The first thing I do is double (or triple) the size of the image with any program (such as Photoshop) to get extra pixels to work with. These pixels don't provide extra information yet, but they will be filled in later.
I repeat this process with many pictures.
After that, I align all the pictures and try to combine them into one. I've tried several methods: blending layers in Photoshop (averaging), using some specialized software to take the mean of the pixels, and even an HDR program (which doesn't work).
I still have other things to try. I do get a better image, but only a little better, and the process is too long and uses a lot of memory. Some specialized software would be needed.
If somebody happens to try it, please let me know your opinion.
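The pipeline described above (upscale each frame, align, then average) can be sketched in a few lines of NumPy. This is only a toy illustration of the approach, not any particular program's method: `upscale_nn` and `stack_average` are made-up helper names, and the frames are assumed to be already aligned.

```python
import numpy as np

def upscale_nn(img, factor):
    """Nearest-neighbour upscale: adds pixels, but no new information yet."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def stack_average(frames):
    """Average the aligned frames pixel by pixel."""
    return np.mean(np.stack(frames), axis=0)

# toy example: four 2x2 frames, each upscaled 2x, then averaged
frames = [np.array([[i, i + 1], [i + 2, i + 3]], dtype=float) for i in range(4)]
upscaled = [upscale_nn(f, 2) for f in frames]
result = stack_average(upscaled)
```

As the posts below point out, without sub-pixel shifts between the frames this averaging mostly reduces noise rather than adding real detail.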
 

Don't think this will work well...

Try zooming in next time to get more detail per frame, then do a stitch to get the picture you originally wanted (a wider picture, of course).
 

Oh yes, I've done that sometimes, but I'm just trying to find new ways.
 

The technique of merging several images of the same object, taken from the same spot and angle, to improve resolution is a common method among astrophotographers. It does indeed increase detail and sharpness, because lenses normally resolve much more than the CCD can record. Each time the shutter is released, a slightly different random sampling is recorded, which is not very obvious. There are no CCDs on the market that can record the full resolution of a good-quality lens. If two or more images are blended, the resolution of the resulting image is increased. Even if you later reduce the size to post on CS, a merged image should show more detail than a single one.
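The idea astro-stacking relies on can be shown with a toy "shift-and-add" reconstruction. This sketch assumes the sub-pixel offsets between frames are known exactly (in real software they are estimated during alignment), and `shift_and_add` is a name invented for the illustration:

```python
import numpy as np

def shift_and_add(frames, shifts, factor):
    """Deposit each low-res frame into the cells of a finer grid that match
    its (known) sub-pixel offset, then average where cells overlap."""
    h, w = frames[0].shape
    hi = np.zeros((h * factor, w * factor))
    count = np.zeros_like(hi)
    for frame, (dy, dx) in zip(frames, shifts):
        hi[dy::factor, dx::factor] += frame
        count[dy::factor, dx::factor] += 1
    count[count == 0] = 1          # avoid division by zero in empty cells
    return hi / count

# toy scene: each shifted low-res frame samples different scene pixels,
# so together the four 2x2 frames carry all the 4x4 information
scene = np.arange(16, dtype=float).reshape(4, 4)
shifts = [(0, 0), (0, 1), (1, 0), (1, 1)]
frames = [scene[dy::2, dx::2] for dy, dx in shifts]
recovered = shift_and_add(frames, shifts, 2)
```

In this idealized case the four shifted low-resolution frames recover the high-resolution scene exactly; real frames add noise, blur and imperfect alignment on top.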
 

OK, thanks a lot.
I was very happy thinking I could invent something new, but almost everything has already been invented.
At first sight, what they call stacking looks similar to making an HDR picture, and they also talk about averaging and noise reduction as separate things. I'll study it.
 

Stacking is so difficult, plus it won't guarantee increased resolution either.. Might as well, instead of using, say, a 50mm lens, use a 100mm lens, shoot 4 images and stitch them together to form the originally intended image lor..
 

If the subject is near, the angle change will affect the end result in a negative way. If the subject is far, like the moon for example, then you'd need very expensive lenses if you are going to double the focal length to increase detail. Using a 2x TC is not a solution, since it degrades the image. So in theory it works as you say too, but in real life doubling the focal length is not a very practical solution. Stacking should not be difficult if the focal length, angle and position are not changed between shots, as long as you stack images of identical sizes and don't start cropping and cutting parts of the images.
 

But if nothing changes, then stacking the images will only reduce noise or give you better dynamic accuracy. It will not give you higher spatial resolution, because the same object is still captured by the same pixels on the sensor, and that sensor will not give you the in-between information required for higher resolution.
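The noise-reduction half of that claim is easy to check numerically: averaging N frames of the same scene through the same pixels shrinks independent noise by roughly the square root of N, while the pixel grid (and hence the spatial resolution) stays unchanged. A quick sketch with made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.zeros((64, 64))                 # flat scene: all variation is noise

# 16 shots of the same scene through the same pixels, fresh noise each time
frames = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(16)]
stacked = np.mean(frames, axis=0)

noise_single = np.std(frames[0])           # close to 10
noise_stacked = np.std(stacked)            # close to 10 / sqrt(16) = 2.5
```

The stacked frame is much cleaner, yet it has exactly the same pixel grid as each input frame.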
 

On a side note, I read before about a method to reduce the sensor noise of very noisy webcam photos:
first you take a few black photos to map the sensor noise, then average them and use the result as a template to remove the noise from your pics..
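That is essentially dark-frame subtraction. A minimal sketch, assuming the sensor adds a fixed pattern (hot pixels, bias) plus random noise to every shot; all names and numbers here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
fixed_pattern = rng.uniform(0.0, 5.0, (16, 16))   # simulated hot pixels / bias

def shoot(scene):
    """Simulated exposure: scene + fixed sensor pattern + random noise."""
    return scene + fixed_pattern + rng.normal(0.0, 0.5, scene.shape)

# 1. take a few "black" photos (lens cap on) and average them into a template
dark_template = np.mean([shoot(np.zeros((16, 16))) for _ in range(8)], axis=0)

# 2. subtract the template from a real shot
scene = np.full((16, 16), 100.0)
raw = shoot(scene)
cleaned = raw - dark_template
```

Averaging several black frames matters: it isolates the repeatable fixed pattern while the random part of the noise averages away, so the template removes only what every shot has in common.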
 


:thumbsup: idea.
 
