Dual Monitor

No, it will not. That will duplicate the same image to two monitors.

This 1ft VGA video splitter cable lets you connect two VGA monitors/displays to a single VGA video source

You need something like this to create a virtual video adapter and 'extend' the desktop to a separate monitor.

Note: Install the drivers on the CD before connecting the device to avoid problems. I've installed many of these, and I've always had trouble when the device gets plugged in before the CD software is installed.

EDIT: Judging by your wording (and correct me if I'm wrong), you're not in the IT dept, or may not have one. Are these laptops that you're trying to add a display to, or are they full-blown desktop PCs?

A desktop may already have another video port (it may just look different, e.g. DVI, HDMI, or DisplayPort), and if you're using laptops, you can use that video-out port to extend the screen as-is, without an adapter.

I'm assuming these are desktops with only a single VGA port, though. Is that correct?
 
Thank you for answering my question.

You are right, we do not have an IT department. We do contract out for some of our needs, but that tends to be for larger items like server maintenance. These are regular run-of-the-mill desktops with only one VGA port. We were thinking about upgrading to either a better video card with a VGA and a DVI port or using a display adapter, like you mentioned. However, one of our employees insisted that the item in the link would work. I was doubtful, but I'm not a computer expert, so it was hard for me to argue against him, especially since he is a programmer.
 
Yeah, he was wrong on that one. It happens.

How much time you're willing to invest in this determines which route you should go.

Installing new video cards in each PC will take a little more time to get up and running. You can get video cards for around the same price as those adapters, so it's not really a price issue. The extra time is in physically installing the card: both solutions require a software installation and a reboot, but a card also has to be inserted into the machine.

The adapters are the easier route in terms of the skill needed: install the software on the CD, reboot, plug in the adapter, connect the monitor, and you're good to go.

An employee may try to convince you that you need the video cards because of graphics, picture quality, blah blah blah. Unless you're doing extremely detailed color correction or video production, you won't notice the difference between the adapter and a bazillion-dollar video card. Office products or accounting software won't look any different to the average user.

Edit: As a side note...

it was hard for me to argue against him, especially since he is a programmer.

Programmers are the worst people to take computer advice from unless they have a background in hardware. No offense to any programmers here, but almost every programmer I've come across who started out as a programmer knows surprisingly little about the rest of the computer... it's always blown me away.
 
As a programmer who knows hardware, I'd have to agree. Most of my brethren suck at hardware. You'd be surprised at the number of them who wouldn't have the slightest idea how to troubleshoot their computer if something goes wrong.
 
I'm leaning towards the adapters, mainly because, like you said, they're easier to install, and we're only looking at spreadsheets, so graphics are not a huge deal. Besides, I don't relish the idea of installing video cards in 12 computers.

It's funny: he's ordered it, and now he's going to try to prove it works.
 
When he gets it all set up, right-click on the desktop (assuming Windows 7) and go to Screen Resolution. He'll be able to see that only one display output is detected.

When you get the actual adapters, use the same screen to set up the extended desktop.
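If you'd rather double-check from a script than eyeball the dialog, here's a rough Python sketch (Windows-only, and assuming Python is installed on the machine; the list_displays helper name is just mine for illustration) that lists the display devices Windows has detected, via the standard EnumDisplayDevices Win32 call. With the splitter you should still see only one active display; a properly installed adapter should show up as an extra one.

import ctypes
from ctypes import wintypes

# Mirrors the Win32 DISPLAY_DEVICEW structure.
class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

def list_displays():
    user32 = ctypes.windll.user32
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        # EnumDisplayDevicesW returns 0 once there are no more adapters.
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break
        active = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        print(f"{dev.DeviceName}: {dev.DeviceString} (active: {active})")
        i += 1

if __name__ == "__main__":
    list_displays()

Purely optional, of course; the Screen Resolution dialog tells you the same thing.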
 

Attachment: Capture.PNG