S-Video Cable

TRDCorolla1

What's the deal with this little cable? Does anyone actually use this? Does it provide high definition output? I'm assuming it doesn't.
 
It's an analogue video link that uses two separate signals to carry brightness and colour, and it doesn't carry audio. It doesn't have much bandwidth compared to other ways of transferring video, so it isn't used for high-def purposes.

edit: well, it can carry HD, it's just really not the best option and wouldn't give as good a result.
 
That sucks. I know most of the newer video cards already support HD. I'd love to stream that across somehow. I'm not 100% sure if my monitor is high definition. It supports DVI, but I believe HDMI or true HD is better.
 
Well, supporting DVI doesn't necessarily (don't quote me on this ;)) mean it supports HD as standard; DVI is more about the way the signal is transmitted than the content itself. Of course, DVI is just about the best way at the moment to transmit HD signals, but it all comes down to the GPU and the monitor supporting HD.
 
I've used S-Video to connect a computer to a TV in the past. Component is taking over from it now, but some older TVs don't support it.
 
Well, S-Video is perfectly fine for "normal" TV signals, but component is better since it uses separate wires for the different signals and can carry more information.
 
Most standard LCD monitors only support DVI at best, so I don't think we're experiencing true HD yet. Until they come out with HDMI monitors and graphics cards, it seems like we're stuck with DVI. I think Vista itself supports HDMI, but the hardware is limited.
 
The only reason you would ever use S-Video nowadays is to send video from a video card that has it to an old standard-definition TV.

Standard definition is 480i/480p, a 720x480 resolution.
High definition is anything above that. Your monitor is (most likely) high definition. The two standard HD resolutions for TVs are 1280x720 and 1920x1080, commonly referred to as simply 720 or 1080.
Then you get into progressive vs. interlaced scanning. For example, 1080p is progressive and 1080i is interlaced. Basically, progressive is better, as the image will be clearer.
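If it helps to put numbers on that, here's a quick back-of-the-envelope sketch in Python (just using the resolutions above; the labels are only for illustration) comparing total pixel counts:

    # Total pixel counts for the resolutions mentioned above
    resolutions = {
        "480 (SD)":  (720, 480),
        "720 (HD)":  (1280, 720),
        "1080 (HD)": (1920, 1080),
    }
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h:,} pixels")
    # 480 (SD):  345,600 pixels
    # 720 (HD):  921,600 pixels
    # 1080 (HD): 2,073,600 pixels

So a 1080 picture has about six times as many pixels as standard definition.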

So then there are a few ways of hooking up your display:
- VGA/Component cables - Analog signal. Causes a loss of quality because the analog signal has to be converted into a digital signal to be displayed on the TV.
- DVI/HDMI - Digital signal. Best quality since no signal conversion needs to be done.
- Composite/S-Video - Worst quality. Can't carry a high definition signal, making it essentially useless for use with a high definition display.
 
OK, your computer monitor is not HD, it's actually XHD or Xtreme High Def. I read that on Nvidia's website. Think about it: your HD TV is generally 720p, which is 1280x720. Some are 1080p, which is 1920x1080. A 1080p set is usually around 50", and 720p sets average about 30" and up. My computer monitor is 20.1", and its highest resolution is 1680x1050, so technically that's not quite as many pixels as 1080p, but they're packed into a much smaller area, so the pixels are smaller and the picture is therefore extremely sharp.
See what I'm saying? So really, the pixel density of a computer monitor is much higher than that of an HD TV.
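To put rough numbers on that, here's a minimal Python sketch of the pixel-density comparison (the 20.1" 1680x1050 monitor and the ballpark 50" 1080p TV size come from the post above):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixel density: diagonal resolution in pixels divided by the diagonal size in inches
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(1680, 1050, 20.1)))  # 20.1" monitor: ~99 pixels per inch
    print(round(ppi(1920, 1080, 50)))    # 50" 1080p TV:  ~44 pixels per inch

So even though the TV has slightly more pixels in total, the monitor packs more than twice as many into every inch.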
 