32 bit for HDR



This might sound stupid to others, but excuse my ignorance: how come all my RAW files are only 16 bit when I process them to JPEGs? How do you convert them into 32 bit? I'm using an E-500. :dunno:
 

Is it? :dunno: It never seems to be a problem for me. What software are you using?
 

If I remember correctly, JPEGs can only store 8 bits per channel of colour data. 32 bits is for HDR, and in most cameras the sensor can't capture that wide a dynamic range anyway, so it's normally 12 bits, up-converted to 16 bits.
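
To make that arithmetic concrete, here is a minimal sketch (assuming Python with numpy; the sample values are invented) of what "up-converting" 12-bit sensor data into a 16-bit container and then dropping to 8 bits for JPEG actually does to the numbers:

import numpy as np

# Hypothetical 12-bit raw sensor values (0..4095), as a typical DSLR records.
raw_12bit = np.array([0, 1024, 2048, 4095], dtype=np.uint16)

# "Up-convert" to a 16-bit container by rescaling 0..4095 onto 0..65535.
# No extra tonal information appears; the same 4096 levels are just spread wider.
full_16bit = (raw_12bit.astype(np.uint32) * 65535 // 4095).astype(np.uint16)

# Down-convert to 8 bits per channel (0..255) for JPEG, discarding levels.
jpeg_8bit = (full_16bit // 257).astype(np.uint8)

print(full_16bit, jpeg_8bit)

Note that the wider container gains nothing on its own, which is why the 32-bit step only matters when you are merging several exposures.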

I don't see much point in converting 16-bit photos to 32-bit photos, since you don't really gain anything and you need more processing power / memory when editing them. If anyone knows what advantages it might bring (for non-HDR purposes), please correct me.

Why do you want to convert your RAW files to 32 bits? Unless you are merging multiple exposures for an HDR image, in which case Photoshop automates the process and outputs a 32-bit image for editing.
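
To show what that merge step amounts to, here is a toy sketch (assuming Python with numpy; merge_exposures and its weighting are my own simplification, not Photoshop's actual Merge to HDR algorithm) of combining bracketed 8-bit shots into one 32-bit float image:

import numpy as np

def merge_exposures(images_8bit, exposure_times):
    # images_8bit: list of uint8 arrays of the same shape, one per bracketed shot
    # exposure_times: matching shutter times in seconds
    acc = np.zeros(images_8bit[0].shape, dtype=np.float32)
    weight_sum = np.zeros_like(acc)
    for img, t in zip(images_8bit, exposure_times):
        x = img.astype(np.float32) / 255.0        # normalise to 0..1
        w = 1.0 - np.abs(x - 0.5) * 2.0           # trust mid-tones more than clipped ends
        acc += w * (x / t)                        # radiance estimate = value / exposure time
        weight_sum += w
    # float32 result = 32 bits per channel of relative scene radiance
    return acc / np.maximum(weight_sum, 1e-6)

The point is that the 32-bit float file holds a range of values no single exposure could, which is the only reason the extra depth is worth the memory.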
 

I have the impression you're referring to the colour depth per colour channel. :bsmilie: 8 bits for common use and 16-bit depth for image manipulation.

I also have the impression that DSLR HDR images are nothing more than layering two or more exposures to get the highlights and shadows right. The only native HDR files I've heard of come from medium format digital backs (which start with 16-bit sensors to begin with).

Did I get the question answered right? :sweat:
 

In CS2, if you do HDR via "Merge to HDR" you get a 32 bit/channel (not per pixel) intermediate file, i.e. 32 x 3 = 96 bits/pixel, which you have to down-convert to 16 bit/channel (48 bits/pixel) to manipulate in Photoshop, and further down to 8 bit/channel, i.e. 24 bits/pixel, to save as a JPEG (the JPEG standard does not allow a 16 bit/channel JPEG). I don't think there is any native out-of-camera 16 bit/channel output either, as DSLRs output at most 12 bits/channel.
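
As a rough illustration of that down-conversion step, here is a sketch (assuming Python with numpy; the x/(1+x) curve is a generic Reinhard-style stand-in, not the actual tone-mapping options CS2 offers) of squeezing a 32 bit/channel float image into 16 or 8 bit/channel integers:

import numpy as np

def tonemap_to_int(hdr_float32, bit_depth=16):
    # Compress unbounded float radiance into 0..1 with a simple x/(1+x) curve,
    # then quantise to the requested integer depth (65535 or 255 levels).
    mapped = hdr_float32 / (1.0 + hdr_float32)
    max_val = (1 << bit_depth) - 1
    out = np.clip(np.round(mapped * max_val), 0, max_val)
    return out.astype(np.uint16 if bit_depth == 16 else np.uint8)

# img16 = tonemap_to_int(hdr, 16)   # 48 bits/pixel, still editable in Photoshop
# img8  = tonemap_to_int(hdr, 8)    # 24 bits/pixel, ready to save as a JPEG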
 
