You're all saying it's plugged into monitors, not 32" or bigger LCD TVs. A good quality TV will limit the resolution to suit the quality of the feed, with a minimum HD width of 1280 pixels.
Sorry if I pissed a few people off with this, I didn't mean to, but probably 90% or more of laptops have crappy BGAs. So many people struggle to get a good picture, and most people that ask me can't even full-screen it.
They buy things like VGA to S-Video cables thinking it will sort it, and in actual fact it stays exactly the same.
Let's see what happens when TC uses his VGA........
Some interesting facts below, obviously ripped from a webpage. It confirms what I have been saying, and more.
Taken from page http://www.ehow.com/facts_5028663_vga-resolution.html
VGA stands for Video Graphics Array. The most prolific video resolution and connection standard available, VGA can be found in computing, TVs, gaming and now wireless devices. It forms the basis for every resolution that followed, including so-called high definition, or HD.
1. IBM pioneered VGA in 1987, when it first saw use on its then-dominant IBM PC models. This was a major breakthrough, since VGA doubled the resolution available to computers and easily outpaced previous standards such as CGA and Hercules, which were low res and could only display a few colors at once.
2. The original VGA protocol called for a resolution of 640X480 pixels with 16 colors (its 256-color mode was limited to the lower 320X200 resolution). Soon after, VGA resolutions went up to 800X600, becoming the base setting for Windows operating systems until the early 2000s. VGA refresh rates go up to 75Hz, making it suitable for analog displays such as CRT monitors.
3. It's important to distinguish between VGA video and VGA adapters, as these may refer to two different things. VGA video refers to a resolution of no more than 800X600, while "VGA adapter" is a generic term for hardware that supports a host of resolutions much higher than the original specs.
4. VGA adapters can be easily spotted by their D-shaped connectors with 15 pins arranged in three rows. Most of them come colored blue for easy identification, and correspond with blue connectors on the back of computers and TVs.
5. Over the course of the 1990s, IBM and other developers built on the base VGA definition to create higher resolutions--these resulted in the birth of XGA, XVGA and other "Super VGA" standards. Resolutions went up to 1920X1080 on the original VGA adapter, far exceeding the initial protocol. However, all these came to be collectively known as "VGA," leading to some confusion.
6. Debate arose as to whether VGA counts as HD. The answer also relates to the acronym's transition from a specific standard to a catch-all identifier. While the original VGA specifications of 640X480 and 800X600 do not count as HD, subsequent iterations do, as HD starts at 1280X720 pixels per frame.
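That HD cutoff is easy to sanity-check yourself. Here's a rough few lines of Python (the function name and the 1280X720 threshold it tests against are just my own illustration of the point above, not anything official):

```python
# Sketch: classify a resolution against the common 1280x720 HD minimum.
# Function name and structure are illustrative, not from any standard.

def is_hd(width, height):
    """Return True if the resolution meets the 1280x720 HD minimum."""
    return width >= 1280 and height >= 720

# The original VGA modes fall short of HD:
print(is_hd(640, 480))    # False
print(is_hd(800, 600))    # False
# Later modes carried over the "VGA" adapter can reach HD and beyond:
print(is_hd(1920, 1080))  # True
```

So a blue 15-pin plug on its own tells you nothing about picture quality; it's the mode the adapter is actually outputting that decides whether you're in HD territory.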