DVI or VGA?

Anubis1980

Hi, got my new Acer 22-inch widescreen monitor delivered today :) woo lol. Anyway, I can finally use my gfx card's DVI output. Is there really much of a difference between DVI and normal VGA? I've had to use ClearType for the first time on an LCD so I can see the words more clearly. I used to hate ClearType lol, so I'm wondering if perhaps DVI has made the words appear 'thinner'.
 
Yeah, DVI is better. The main reason is that it's digital and VGA is analog. Use the DVI since you have it; the quality will be noticeably better.
 
I'm on a DVI monitor, but running off VGA. I'll have to get hold of a DVI cable and see if it's any better.
At the moment, there's a lot of banding on colours in games, etc.
It may just be the monitor's fault though...
 
I have to say it does look clearer than my older monitor. My old one wasn't bad though, a 19-inch Acer LCD. I also have a VGA output on the monitor, but it's too much of a pain to swap cables lol. Just wanted to know if DVI was a gimmick or actually made a difference.
 
Anubis1980 said:
I have to say it does look clearer than my older monitor. My old one wasn't bad though, a 19-inch Acer LCD. I also have a VGA output on the monitor, but it's too much of a pain to swap cables lol. Just wanted to know if DVI was a gimmick or actually made a difference.
Nah, I can notice a slight difference. I don't have the best eyes though.
 
Found this great explanation!! It kinda answered my own question lol, but I thought I'd share it anyway.


DVI is better than VGA for TFTs because DVI is digital and VGA is analog. A TFT displays its picture digitally, pixel by pixel. Via DVI the panel gets data for each pixel, so the picture generated by the graphics card matches the pixels on the panel exactly.

Not so with VGA. First, the picture is generated digitally in the graphics card, then it's converted to analog. In the TFT it gets converted back to digital (a pointless double conversion, which means quality loss), using the phase and clock settings to work out which pixel should display what colour. The phase and clock can't be adjusted so precisely that every pixel generated by the graphics card lands exactly on the corresponding pixel of the panel, so the picture gets interpolated a little bit, which again means quality loss.

The electron guns of a CRT need analog signals, which is why VGA is fine for CRTs and DVI would make no sense there.
For TFTs, DVI is best. I wouldn't get a TFT without DVI.

So I guess DVI is the way to go for TFTs.
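
Just to picture that 'pointless double conversion' point, here's a rough Python sketch. It's purely illustrative: the phase error and the moving-average "bandwidth" filter are made-up numbers, not a model of real VGA hardware. It just shows how the digital -> analog -> digital round trip can smear a sharp edge, while the DVI path hands the pixel values over untouched.

# Toy illustration only: invented numbers, not a real hardware model.
import numpy as np

def dvi_path(pixels):
    # DVI: pixel values travel digitally and reach the panel unchanged.
    return pixels.copy()

def vga_path(pixels, phase_error=0.3, samples_per_pixel=8):
    # VGA: the card converts pixels to an analog waveform; the LCD then
    # re-samples that waveform with its own clock/phase. Limited analog
    # bandwidth plus a small phase error means each panel pixel picks up
    # a blend of neighbouring source pixels, i.e. slight interpolation.
    n = len(pixels)
    analog = np.repeat(pixels.astype(float), samples_per_pixel)
    # Crude stand-in for cable/DAC bandwidth: a moving-average low-pass.
    kernel = np.ones(samples_per_pixel) / samples_per_pixel
    analog = np.convolve(analog, kernel, mode="same")
    # Panel samples the waveform slightly off the pixel centres.
    sample_points = (np.arange(n) + 0.5 + phase_error) * samples_per_pixel
    sample_points = np.clip(sample_points, 0, len(analog) - 1)
    recovered = np.interp(sample_points, np.arange(len(analog)), analog)
    return np.clip(np.round(recovered), 0, 255).astype(pixels.dtype)

if __name__ == "__main__":
    # A hard black/white edge, like the stem of a letter in ClearType text.
    row = np.array([0, 0, 0, 255, 255, 255, 0, 0], dtype=np.uint8)
    print("source :", row)
    print("via DVI:", dvi_path(row))
    print("via VGA:", vga_path(row))  # edge pixels come back greyish

Running it, the DVI row comes back identical, while in the VGA row the pixels either side of the black/white transition come back grey, which is basically the "interpolated a little bit" the quote is talking about.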
 
Anubis1980 said:
Found this great explanation!! It kinda answered my own question lol... So I guess DVI is the way to go for TFTs.
Yeah, that's what Tommy said, he just didn't explain analog vs. digital.
 
Oh yeah lmao, sorry :rolleyes: I find it hard to focus on the monitor, it's too big, I don't know where to look or what to read ;)
 