18th August 2003, 09:46 PM
Recently, I have been trying to take some still captures of images projected onto a wall from a video projector. The endeavor is to document, as a consumer, various aspects of a given projector.
Some of the things I am trying to capture (using photography) include, but are not limited to, the following:
Scene (Instantaneous) Contrast
Vertical (V) Resolution
Horizontal (H) Resolution
Heat (causing optical distortion) ... and so on.
My problem lies in the environment in which the projector and projected image occur. The room has a considerable amount of light control, so the ambient light is nearly non-existent and well below the minimum light levels of the projector and projection.
My concern is for the accurate representation of what my eyes are seeing. I fully recognize that what my eyes see may be more or less than what someone else can see. But it's what I'm seeing that I am trying to document, albeit accurately.
Now, my next comments are not meant to place a particular technology in a bad light, but simply to recognize what I've experienced through my documentation attempts. With that said, I have found that the two digital cameras I have (Kodak DC260 and DC215) have a poor ability for low-light detection and capture. Furthermore, these two cameras seem incapable of capturing an accurate exposure when using their internal metering systems.
What captures I have been able to produce are accomplished through a series of long-duration exposures ranging from 1.5 to 5 seconds. If I make a 4-5 second exposure to capture an absolute black level for the projector (i.e. the projector's ability to produce a 0-IRE full-field test pattern), my eyes can see something almost immediately, in far less time than the camera can capture with a short exposure. This tells me that my eyes are more sensitive to low light levels than the digital cameras I own.
Also, if I take the same 4-5 second exposure of a projected movie scene, the exposure is all wrong, typically requiring an exposure only 1/3 as long. This tells me the dynamic range of my digital cameras is strongly limited.
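To put that difference in photographic terms, the gap between a 4-5 second black-level exposure and one a third as long can be expressed in stops (a minimal sketch; the function name is my own, not from any camera API):

```python
import math

def stops_between(t_long, t_short):
    """Difference between two exposure times, in photographic stops
    (one stop = a doubling or halving of exposure time)."""
    return math.log2(t_long / t_short)

# A 4.5 s exposure vs. one a third as long (1.5 s):
diff = stops_between(4.5, 1.5)
print(f"{diff:.2f} stops")  # about 1.58 stops
```

So the scene content and the black level sit roughly a stop and a half apart in required exposure, which the cameras' metering evidently cannot bridge.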
So, imagine a completely unlit, sealed projection room with black (or extremely dark-colored) walls and a projector projecting a full-field 0-IRE test pattern, but having its own faults comparable to a 10-15 IRE (or higher) result, and then trying to capture this photographically.
With conventional film, one can choose a film speed more sensitive to low light levels (e.g. ISO 800 or faster) and make multiple exposures to zero in on the dynamic-range boundaries and capture the best representation, then work from the exposure length so determined for a projected image.
The final intent is to produce an image in digital form (whether direct or through film-to-digital conversion) which, when viewed on a calibrated monitor, best represents what my eyes are seeing and without manipulation (dodging or burning, etc.) prior to the end result.
Note: IRE is a measure of video signal level (and hence projected light), which ranges from zero (0) for no light (black) to 100 for full light (white). NTSC, or SDTV, assumes black is 7.5 IRE in the United States, but in Japan it's 0 IRE for black. HDTV uses 0 IRE for black. I am trying to capture conditions in which the projected image for 0 IRE is not being accurately projected, and in doing so determine what IRE level the projected image best represents.
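The IRE-to-relative-level mapping described in the note can be sketched as a small helper (an illustration only, assuming a simple linear scale between the black setup level and 100 IRE; the function name is my own):

```python
def ire_to_relative_level(ire, black_setup=7.5):
    """Map an IRE value to a 0.0-1.0 relative video level, treating
    `black_setup` IRE as black and 100 IRE as white.  Values below
    the setup level clip to 0.0."""
    level = (ire - black_setup) / (100.0 - black_setup)
    return max(0.0, min(1.0, level))

print(ire_to_relative_level(7.5))            # US NTSC black -> 0.0
print(ire_to_relative_level(100))            # white -> 1.0
print(ire_to_relative_level(0, black_setup=0.0))  # HDTV/Japan black -> 0.0
```

A projector whose "black" actually corresponds to 10-15 IRE on this scale is emitting a visible grey instead of true black, which is exactly the fault being photographed.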
18th August 2003, 10:47 PM
You didn't phrase a question, so I assume you are asking "how does one go about doing it?"
1. You are testing resolution. Your camera must completely out-resolve your test scene by at least double the resolution (Nyquist frequency). You need to calculate the maximum wall resolution, then scale it by the ratio of wall size to sensor size. For example, you expect 1600x1200 pixels on a wall image 1.6m x 1.2m. That's 0.5 lp/mm. With a camera sensor size of 36x24 (mm), filling the frame exactly 100%, you will need [0.5 x (1600/36) x 2 =] 44.4 lp/mm on the sensor. That's quite high. With smaller sensors, the resolution requirements scale accordingly. I think none of the current crop of non-full-frame dSLRs can meet this resolution standard, at least theoretically. You can do the math based on pixel density and sensor size for each model. The Canon 10D resolves at ~67 lp/mm, if I remember correctly.
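The arithmetic in point 1 can be wrapped in a short helper (a sketch following the reasoning above; the function name and the x2 Nyquist safety factor are as described, not from any standard library):

```python
def required_sensor_lpmm(wall_px_wide, wall_width_mm, sensor_width_mm, safety=2.0):
    """Line pairs per mm required on the camera sensor to out-resolve
    the projected wall image by `safety` (Nyquist margin), assuming the
    projected image exactly fills the sensor frame."""
    wall_lpmm = (wall_px_wide / wall_width_mm) / 2.0  # one line pair = 2 pixels
    magnification = wall_width_mm / sensor_width_mm
    return wall_lpmm * magnification * safety

# 1600 px across a 1.6 m (1600 mm) wide image, 36 mm full-frame sensor:
print(round(required_sensor_lpmm(1600, 1600, 36), 1))  # 44.4 lp/mm
```

Running the same calculation with a smaller sensor width shows immediately how the requirement climbs for sub-full-frame cameras.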
2. You cannot use the camera's metering or white balance. In fact, not even autofocus. Anything which is 'auto' by the camera will screw up your results because you do not know what the camera chose, and you cannot compare the outputs fairly. Fix and bracket exposure, from below your darkest to above your brightest. For example, fixing the aperture at f8, shoot from 1/2000 to 8s in one or half stop increments, generating images for 1/2000, 1/1000, 1/500, 1/250, 1/125, 1/60, 1/30, 1/15, 1/8, 1/4, .5, 1, 2, 4, 8. With WB set to a fixed value, or bracketed for EACH exposure. Only then can you have a set of fair images to compare across projectors.
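The full-stop bracket sequence in point 2 can be generated programmatically (a sketch under the assumption of exact doublings; real cameras label speeds with nominal rounded values like 1/15 or 8 s rather than exact powers of two):

```python
def full_stop_sequence(fastest, slowest):
    """Shutter times from `fastest` to `slowest` seconds, doubling once
    per full stop.  A small tolerance absorbs the fact that the endpoints
    are nominal values rather than an exact power-of-two ratio."""
    times, t = [], fastest
    while t <= slowest * 1.05:
        times.append(t)
        t *= 2
    return times

seq = full_stop_sequence(1/2000, 8)
print(len(seq))  # 15 exposures, one per full stop, matching the list above
```

Halving the step (for half-stop increments) simply doubles the number of frames to shoot and compare per projector.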
3. Same tripod height, same image distance, same lens, same focussing distance (manual) across all projectors.
4. That's how I would attempt this. If budget allows, I would also get either a Canon 1Ds or a Kodak 14n, along with a high-resolution 50mm prime. 50mm or longer, as 35mm or shorter lenses may introduce their own distortions. You didn't mention you were testing for image distortion, though, so this may not be a factor.
Hope this makes sense.
Last edited by ST1100; 18th August 2003 at 10:51 PM.
21st August 2003, 04:18 AM
Sometimes I have a habit of providing background information and then completely forgetting the question.
My major concern is accurately reproducing the exposure that my eyes see. I've reached a point where, if I expose for the content field, everything else is misrepresented.
I do understand the concept of bracketing. But this does not overcome my above result. I sometimes wonder if still photography is not the best approach, and maybe video/film would be better.