How to best set up 1 computer & 2 video cards for 2 screens?
November 2, 2008 7:41 PM

I have a computer with built-in video (GMA 3100) [1 D-sub] and also just bought a $20 video card (Radeon X1300) [1 D-sub, 1 DVI]. I'm connecting both my LCD monitor and Projector to the computer. Will I be able to use both video cards at once and show different things? Which of the two D-subs should I connect to? Which screen should get the better (DVI) connection?
posted by lpctstr; to Computers & Internet (6 answers total)
Assuming you're on a Mac, open the Displays pane in System Preferences.

If you have the LCD panel plugged into one video card and the projector plugged into the other, there will be two display preference windows. Select the window for the display whose settings you want to adjust.

You can also set the display layout here: if you want the displays as separate screens, you can put one to the right of the other, or you can set up display mirroring instead.

Connect the DVI output to whichever display has the worse analog-to-digital converter, since DVI bypasses the analog conversion entirely. The easiest way to find out is just to test both arrangements.
posted by Blazecock Pileon at 8:54 PM on November 2, 2008

You should be able to get three independent pictures out of the three available connectors, and you should be able to extend your desktop across all three if you want. All modern operating systems support this kind of thing, though they have different ways of achieving it. What are you running?

If you're on standard PC hardware, and you visit your BIOS settings and turn off the integrated (GMA 3100) graphics entirely, you will get a bit more system RAM and your RAM will also run marginally faster. Dunno if Macs have some similar option.
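If it turns out you're on Linux, you can do the desktop-extending from a terminal with xrandr. A rough sketch, assuming the X1300's outputs show up as DVI-0 and VGA-0 and that your screens run at the resolutions shown (your driver may name and size things differently, so check the query output first):

```shell
# See which outputs are connected and which modes each supports
xrandr --query

# Extend the desktop: LCD on DVI-0, projector on VGA-0 to its right
xrandr --output DVI-0 --mode 1680x1050 --primary \
       --output VGA-0 --mode 1024x768 --right-of DVI-0

# Or mirror the projector onto the LCD's picture instead
xrandr --output VGA-0 --same-as DVI-0
```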
posted by flabdablet at 1:45 AM on November 3, 2008

Oh, and use the DVI connector with the device that has the higher resolution. Unless, of course, things look better to you the other way around.
posted by flabdablet at 1:46 AM on November 3, 2008

Best answer: Don't use the built-in video. Plug the higher-resolution screen into the DVI port and the other into the VGA. Alternatively, if you've got a long run to the projector, get a high-quality DVI or HDMI cable and use the digital connection for the long run.

WinXP, Vista, and OS X all play happily with multiple monitors, with a different picture on each. (Though a different wallpaper on each takes some extra magic.)
posted by defcom1 at 3:41 AM on November 3, 2008

Best answer: Depending upon your motherboard, the onboard Intel chip may be disabled automatically when the new graphics card is installed (some designs share the same PCIe or AGP bus, so only one can be active). Either way, the Radeon X1300 is significantly better in every department. If it doesn't happen automatically, you may need to turn onboard graphics off in the BIOS to make the Radeon the primary display device.

I would connect the LCD monitor to the DVI port on the Radeon: you're sitting a lot closer to it as a rule, and the digital connection will reduce the ghosting and noise that can affect the clarity of text. The VGA lead on the Radeon will be fine for the projector (assuming you're doing large-font presentations or video playback), and it's a lot easier to get a long VGA lead than a long DVI lead.
posted by ArkhanJG at 5:07 AM on November 3, 2008

Chiming in to second ArkhanJG. I've really only worked with HP/Compaqs and Dells, but in my experience you won't be able to use onboard and a PCI card simultaneously. Just use the X1300, following Ark's advice.
posted by Kupo? at 8:47 AM on November 3, 2008
