Dell's 19" LCD monitor - to go for analog or DVI?


Status
Not open for further replies.

Adzz

Senior Member
Jun 20, 2004
Hi guys, when it comes to monitors, I'm pretty new to this. I'm planning to get a Dell desktop, and they come with a 17" analog LCD monitor. I thought of upgrading to a 19" one, which will cost only 105 for analog or 233 if I opt for DVI instead.

My question is: will there be much difference between analog and DVI? I don't really want to pump in 233 more, but 105 is pretty fine.

Does it really differ much for photo editing if I choose analog instead?

What is the main difference between analog and DVI, and which is used more?

Thanks, folks. This monitor will mainly be used for photo editing, so I thought I'd get some views from here.

Cheers!
 

Text will be sharper. How about images? Will there be a difference too?
 

LCD screens understand digital images; that is what DVI is all about.

When you use the normal RGB (VGA) output from your PC, the digital signal is converted to analog and then converted back to digital at the display, with the loss in picture quality that brings.

You avoid this by using a DVI output to your LCD. The result is better overall sharpness and image quality.
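That double conversion can be sketched with a toy model. This is only an illustration, not a measurement: it assumes the standard 0.7 V VGA video swing, and the cable noise figure is made up. It shows how re-digitising a noisy analog level can land on the wrong 8-bit pixel code, which a pure-digital DVI path avoids:

```python
import random

def analog_round_trip(pixels, noise_mv=3.0, seed=42):
    """Toy VGA path: 8-bit code -> 0..700 mV analog level -> noisy
    cable -> re-digitised 8-bit code at the monitor.
    noise_mv is a hypothetical cable-noise amplitude."""
    rng = random.Random(seed)
    out = []
    for p in pixels:
        mv = p / 255 * 700.0                    # DAC on the graphics card
        mv += rng.uniform(-noise_mv, noise_mv)  # noise picked up on the cable
        code = round(mv / 700.0 * 255)          # ADC inside the monitor
        out.append(min(255, max(0, code)))
    return out

src = list(range(0, 256, 8))
noisy = analog_round_trip(src)
errors = sum(1 for a, b in zip(src, noisy) if a != b)
print(f"{errors} of {len(src)} pixel codes changed in transit")
```

One 8-bit step is only about 2.7 mV here, so even a few millivolts of noise is enough to shift some codes; with zero noise the round trip returns the pixels unchanged.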
 

I would pay particular attention to the screen resolution. I have a 15" screen running 1400 x 1050. Very good for photo editing.
 

Del_CtrlnoAlt said:
Is it a widescreen? How does it do 1400x1050?

Hmm... very odd resolution. 15" is too small for 1400 x 1050; you'd probably need a magnifying glass to read the text.

15" should be 1024 x 768 natively.

17" & 19" are 1280 x 1024.

20" is 1600 x 1200.

The smallest widescreen I've ever seen is 18.1".

For me, I use a 20" widescreen, which is 1680 x 1050.

There are 24" widescreens that use 1920 x 1200... hope to get 2 of those one day :D

[attachment: display1uu.png]


Right now I've got a mixture of wide and non-wide screens, but I use the widescreen most of the time.
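A quick way to compare the panels in this list is pixel density. A minimal sketch; the sizes and resolutions are the ones quoted in this thread, and the 24" panel is assumed to be the standard 1920 x 1200 (WUXGA):

```python
import math

def ppi(diag_in, w, h):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(w, h) / diag_in

# Sizes and native resolutions mentioned in the thread
panels = [
    ('15" XGA',    15.0, 1024,  768),
    ('15" SXGA+',  15.0, 1400, 1050),
    ('19" SXGA',   19.0, 1280, 1024),
    ('20" UXGA',   20.0, 1600, 1200),
    ('20" WSXGA+', 20.0, 1680, 1050),
    ('24" WUXGA',  24.0, 1920, 1200),
]
for name, d, w, h in panels:
    print(f"{name}: {ppi(d, w, h):.0f} ppi")
```

The 15" 1400 x 1050 panel works out to roughly 117 ppi versus about 85 ppi for 15" at 1024 x 768, which is exactly why the text looks so small without a close viewing distance.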
 

I use analog for my Dell 2405FPW LCD and it displays text as crisply as DVI output.
 

ultrakyo said:
I use analog for my Dell 2405FPW LCD and it displays text as crisply as DVI output.

Which means something is wrong with your graphics card or LCD; that's why the DVI quality is not as good as it should be.
 

Wai said:
Which means something is wrong with your graphics card or LCD; that's why the DVI quality is not as good as it should be.

Maybe you need new spectacles. Just joking.
 

urban said:
Maybe you need new spectacles. Just joking.

Graphics cards and cables do make a difference when you use high resolutions like 1920 x 1200; some brands have poorer 2D quality because of the components used. I do see the difference with my perfect eyesight ;p

There is a limit to the bandwidth an analog cable can carry without signal loss; when the cable is too long or the shielding is not good enough, you can see the 2D quality suffer. This is more prominent on higher-resolution LCDs because the amount of data that needs to be carried is really huge. DVI does not have this problem, which is why it is usually found on higher-resolution LCDs.

Quote from Tom's Hardware:
http://graphics.tomshardware.com/graphic/20041129/tft_connection-01.html

Connecting a TFT monitor via VGA means sub-par performance. The graphics processor provides digital data, which is then converted into an analog signal, transferred to the monitor and converted back into digital for the processor. The result is an unnecessary loss of signal quality.
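To put a rough number on that bandwidth point: the pixel clock a cable or link must carry is approximately active pixels times refresh rate, plus blanking overhead. A back-of-the-envelope sketch; the 25% blanking allowance is an assumption, while 165 MHz is the TMDS limit of a single DVI link:

```python
def pixel_clock_mhz(w, h, hz, blanking=0.25):
    """Approximate pixel clock in MHz: active pixels times refresh rate,
    with a rough fractional allowance for horizontal/vertical blanking."""
    return w * h * hz * (1 + blanking) / 1e6

SINGLE_LINK_DVI_MHZ = 165  # TMDS limit of a single DVI link

for w, h in [(1280, 1024), (1600, 1200), (1920, 1200)]:
    clk = pixel_clock_mhz(w, h, 60)
    verdict = ("fits single link" if clk <= SINGLE_LINK_DVI_MHZ
               else "needs dual link or reduced blanking")
    print(f"{w} x {h} @ 60 Hz: ~{clk:.0f} MHz ({verdict})")
```

With CVT reduced blanking, the 1920 x 1200 @ 60 Hz clock drops to about 154 MHz, which is how those 24" panels run over a single DVI link. An analog cable has no hard cutoff, but at these clock rates the quality of the DAC, cable, and shielding starts to show, which matches what the post describes.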
 

Wai said:
Hmm... very odd resolution. 15" is too small for 1400 x 1050; you'd probably need a magnifying glass to read the text.

15" should be 1024 x 768 natively.

17" & 19" are 1280 x 1024.

20" is 1600 x 1200.

The smallest widescreen I've ever seen is 18.1".

For me, I use a 20" widescreen, which is 1680 x 1050.

There are 24" widescreens that use 1920 x 1200... hope to get 2 of those one day :D

[attachment: display1uu.png]


Right now I've got a mixture of wide and non-wide screens, but I use the widescreen most of the time.

My 15" ThinkPad laptop has a high-res SXGA+ screen. I love it. Because it is a laptop, you view it from close up, so it is not too bad.

I would like a bigger screen with at least 1400 x 1050.
 

Wai said:
Graphics cards and cables do make a difference when you use high resolutions like 1920 x 1200; some brands have poorer 2D quality because of the components used. I do see the difference with my perfect eyesight ;p

There is a limit to the bandwidth an analog cable can carry without signal loss; when the cable is too long or the shielding is not good enough, you can see the 2D quality suffer. This is more prominent on higher-resolution LCDs because the amount of data that needs to be carried is really huge. DVI does not have this problem, which is why it is usually found on higher-resolution LCDs.

Quote from Tom's Hardware:
http://graphics.tomshardware.com/graphic/20041129/tft_connection-01.html

Connecting a TFT monitor via VGA means sub-par performance. The graphics processor provides digital data, which is then converted into an analog signal, transferred to the monitor and converted back into digital for the processor. The result is an unnecessary loss of signal quality.

Of course DVI reigns supreme compared to analog, but what I am saying is that I still get excellent quality through analog. Unfortunately, the graphics card my LCD is hooked up to does not have DVI, so I have to stick with analog until my next upgrade. However, the result is still perfectly satisfying. Below is a macro shot of my LCD pixels using analog.

Click Here
 

I would encourage you to get a graphics card with DVI; otherwise it is a waste with your 24" LCD. Not only that: since your card does not have DVI, I suppose you are using onboard graphics or a rather old card, and their 2D is usually very bad at high resolution (unless you are using a Matrox G400/450, which is the only old card that can produce a sharp image at 1600 x 1200).

Even a $50 9200SE with DVI out is good enough. If you think analog is sharp, wait till you see DVI...

Last time, I connected both of my LCDs to an X800XT card that has 1 x analog and 1 x DVI connector. I could see the difference in sharpness right away with them side by side... in the end I bought a 9200SE PCI just for the 2nd LCD.

ultrakyo said:
Of course DVI reigns supreme compared to analog, but what I am saying is that I still get excellent quality through analog. Unfortunately, the graphics card my LCD is hooked up to does not have DVI, so I have to stick with analog until my next upgrade. However, the result is still perfectly satisfying. Below is a macro shot of my LCD pixels using analog.

Click Here
 

I am using an ATI AIW 9600 Pro. My mobo, an Abit KR7A-RAID, only supports 4X AGP. I've tried upgrading my graphics card to the AIW 9700 and 9800 series, but my PC fails to boot, and that's why I am still using analog.

Wai said:
I would encourage you to get a graphics card with DVI; otherwise it is a waste with your 24" LCD. Not only that: since your card does not have DVI, I suppose you are using onboard graphics or a rather old card, and their 2D is usually very bad at high resolution (unless you are using a Matrox G400/450, which is the only old card that can produce a sharp image at 1600 x 1200).

Even a $50 9200SE with DVI out is good enough. If you think analog is sharp, wait till you see DVI...

Last time, I connected both of my LCDs to an X800XT card that has 1 x analog and 1 x DVI connector. I could see the difference in sharpness right away with them side by side... in the end I bought a 9200SE PCI just for the 2nd LCD.
 

Adzz said:
I thought of upgrading to a 19" one, which will cost only 105 for analog or 233 if I opt for DVI instead.

My question is: will there be much difference between analog and DVI? I don't really want to pump in 233 more, but 105 is pretty fine.

Does it really differ much for photo editing if I choose analog instead?

What is the main difference between analog and DVI, and which is used more?

If I recall correctly, the specs of the two Dell 19" LCD monitors differ by more than just the DVI option. Look out for a better contrast ratio and USB 2.0 ports on the more expensive monitor :D
 

I have 3 of the Dell 19" UltraSharps: one with DVI input, two with analog input (i.e. the same monitor, different graphics cards). Very happy with this monitor.

The DVI input gives you very accurate colour. There is a colour cast from the analog input, depending on the card (but nothing that cannot be adjusted to your liking). The best tip I have: use the "Auto-adjust" function on the monitor if the output from your analog card is fuzzy. This is a button on the monitor itself. I cannot detect any significant difference in text sharpness between DVI and analog (but then I have not compared them side by side).
 

This sounds stupid, but is the DVI connection on an LCD monitor the same as the one on an LCD/plasma TV? Meaning, if my video card comes with a DVI output, can I use the DVI connection to connect it to the LCD/plasma TV instead of the normal PC input connection? Are the DVI connector pins the same?
 
