Should I use the video card from my old desktop computer in my new desktop? Or is it too much hassle and I should buy a new video card that supports two video outputs?
May 3, 2004 6:53 AM

Two video cards, one desktop: If I take a video card from an old desktop and install it in a new desktop how much BIOS/system customizing do I have to do? I'm purchasing a new desktop today - a cheapo eMachine and was hoping to get a second LCD panel as well. Or is it easiest to just buy a new video card that supports two video outputs (any recommendations)?
posted by ao4047 to Computers & Internet (6 answers total)
 
I've only used one video card with dual outputs (it's wonderful), but here's a short blurb from TechTV that might get you started.
posted by shinynewnick at 7:05 AM on May 3, 2004


I once tried to install a second video card in my XP box, but no dice: some problem with the BIOS only recognizing one card at a time. I fiddled with it for about half a day and gave up.
posted by signal at 7:17 AM on May 3, 2004


i have a geForce FX card, similar to this one. relatively inexpensive, has standard monitor out, digital out, and s-video out. one of those ought to give you a two-display setup: use a CRT monitor on the standard output and a digital flatscreen on the DVI hookup. as long as your new machine has an 8x AGP slot you're good to go.
posted by caution live frogs at 7:17 AM on May 3, 2004


Response by poster: Thanks so far - totally forgot to check TechTV.
posted by ao4047 at 7:58 AM on May 3, 2004


If you're running XP, it should be as easy as installing the old video card, hooking up both monitors, going to the display settings, clicking on the second monitor that should appear, and clicking "extend my desktop."

I have a desktop running 3 monitors and that's pretty much the extent of any trouble I had to go through to get it to work.
posted by Mick at 9:15 AM on May 3, 2004


If you're planning to do this under Windows, there are a few minor caveats to be aware of:

You only have one AGP slot. Subsequent video cards will need to be installed in PCI slots.

Due to limitations in many Windows video drivers, some cards cannot be used in either the primary or secondary role in a multi-monitor setup. This varies among versions of Windows -- I believe Windows 98 offered the most options, while 2000 and XP support relatively fewer hardware combinations.

There are two different concepts of "primary" and "secondary" display:

In the BIOS, you'll most likely want to set the PCI card to "primary." Sometimes there's a BIOS Setup switch where you pick Primary Display Adapter. The answer in most cases is PCI. The BIOS and startup screens will display on this one. If you don't get this set, Windows will probably detect the PCI card, but refuse to use it.

In Windows, the concepts of "primary" and "secondary" are completely separate and don't depend on what you set in the BIOS (apart from the detection/willingness-to-use bit). The primary display can be selected on the Settings tab of the Display Properties control panel. The primary adapter is the one used for hardware video overlay (AVI/TV/DVD playback) and for DirectX/Direct3D and OpenGL rendering, and it also holds the taskbar. The secondary display is just extra screen space.
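To picture what "extra screen space" means: Windows stitches all enabled monitors into one big virtual-screen coordinate space, with the primary display's top-left corner at (0, 0) and secondaries positioned around it (possibly at negative coordinates). A minimal sketch of that geometry in Python, with hypothetical monitor sizes and a made-up helper name:

```python
# Sketch of the Windows "virtual screen": one bounding box covering every
# monitor rectangle. Monitor sizes and positions below are hypothetical.

def virtual_screen(monitors):
    """Return (left, top, right, bottom) covering all (x, y, w, h) rects."""
    left = min(m[0] for m in monitors)
    top = min(m[1] for m in monitors)
    right = max(m[0] + m[2] for m in monitors)
    bottom = max(m[1] + m[3] for m in monitors)
    return (left, top, right, bottom)

# Primary 1280x1024 at the origin; secondary 1024x768 extending off its
# right edge, which is where the "extended desktop" space comes from.
primary = (0, 0, 1280, 1024)
secondary = (1280, 0, 1024, 768)

print(virtual_screen([primary, secondary]))  # (0, 0, 2304, 1024)
```

A secondary placed to the *left* of the primary would give the virtual screen a negative left edge, since the primary's corner stays pinned at (0, 0).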

You'll want to tell Windows that your primary display is the AGP card, since that's the better of the two.

Driver compatibility is so dodgy that it's generally a good idea just to use two nVidia or Matrox chipset cards. Everything else will require troubleshooting and fiddling around.

If you play games, you'll want to disable the secondary display. Most games get very, very confused when multiple monitors are enabled.
posted by majick at 11:44 AM on May 3, 2004

