

graphics card outputs
February 26, 2012 9:06 AM   Subscribe

I own two monitors with twin inputs - HDMI and VGA. Why can't I find a graphics card with twin HDMI outputs? Are HDMI connections inferior? Or expensive? Or uncommon? Or splittable like USB? Or...?
posted by sodium lights the horizon to Computers & Internet (10 answers total)
 
HDMI is mostly compatible with single-link DVI-D, but it won't support the higher resolutions (e.g. 2560x1440) possible with dual-link DVI or DisplayPort, so manufacturers tend to offer more of those ports.
posted by Monday, stony Monday at 9:10 AM on February 26, 2012


Easy problem to solve, though. Just get a card w/DVI and HDMI, and then get a DVI-HDMI adapter for the DVI port. Cheap, simple, done.
posted by fake at 9:12 AM on February 26, 2012


Oh, the monitors came with DVI->HDMI convertors. But since I'm looking for a new graphics card, I wondered why they all had multiple DVI-D ports but just one HDMI...
posted by sodium lights the horizon at 9:14 AM on February 26, 2012


It's not that they don't exist; they're just considerably rarer.
posted by deezil at 9:45 AM on February 26, 2012


Normally HDMI is only used for driving a TV, so no reason to run more than one.
posted by Z303 at 9:47 AM on February 26, 2012


Everyone's posted some pretty good thoughts so far; I'll throw in my two cents as well. HDMI certainly isn't inferior, but deezil is right in the sense that most of the cards out there (even the higher-end ones) just don't have two HDMI ports.
posted by isoman2kx at 9:50 AM on February 26, 2012


Depending on the monitor in question, don't immediately dismiss the VGA port.

Cheap flat panel monitors will have cheap ADC chips on their VGA port, resulting in a noisy signal. Better quality monitors will have better quality ADC chips. The ADCs on my Dell 2405FPWs are good enough that I can't tell whether it is being driven by the DVI or VGA port. At 1920x1200!
posted by b1tr0t at 10:27 AM on February 26, 2012


If you're running 1080p-equivalent monitors (1920x1080), it doesn't matter at all. MSM has it right: DVI is more flexible and expandable (unless you're running sound over the same cable).
posted by supercres at 10:53 AM on February 26, 2012


Let me try to summarize: if you're making cheap monitors, at 1920x1080 or less, it makes sense to have VGA + HDMI, since that'll make it easy to drive them from a computer or a home-theater receiver. If you're making expensive monitors at 2560x1440 or more, you probably want to have dual-link DVI or DisplayPort, because devices with HDMI v1.2 outputs won't be able to use the full resolution of your display.

If you're making video cards, DVI ports take up a lot of space, but one port can drive a monitor with a dual-link or single-link DVI input, a VGA input for your old monitor (with adapter) or an HDMI 1.2 input (with adapter) for your cheap monitor. Any large-resolution monitor is going to have either dual-link DVI or DisplayPort input. And you also have NIH (not-invented-here): HDMI was established by consumer electronics companies, while DisplayPort is standardized by VESA.

And I think that's where you get your current situation, where most cards go something like 1 or 2 DVI ports, 1 HDMI, and 0-2 miniDisplayPort.
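The resolution limits above come down to pixel-clock arithmetic. Here's a rough back-of-the-envelope sketch in Python (the 1.12 blanking overhead is an approximation of reduced-blanking timings; exact figures depend on the CVT formulas, but the conclusion is the same):

```python
# Rough pixel-clock estimate. Real display timings add blanking
# intervals per VESA CVT; 1.12 approximates reduced-blanking totals.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.12):
    return width * height * refresh_hz * blanking_overhead / 1e6

# Single-link DVI and HDMI 1.2 share a 165 MHz TMDS clock ceiling.
SINGLE_LINK_LIMIT_MHZ = 165

for w, h in [(1920, 1080), (2560, 1440)]:
    clk = pixel_clock_mhz(w, h, 60)
    verdict = "fits" if clk <= SINGLE_LINK_LIMIT_MHZ else "exceeds"
    print(f"{w}x{h}@60Hz needs ~{clk:.0f} MHz -> {verdict} single link")
```

So 1920x1080@60 (~139 MHz) fits under the 165 MHz single-link ceiling, while 2560x1440@60 (~248 MHz) blows past it, which is why that class of monitor needs dual-link DVI or DisplayPort.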
posted by Monday, stony Monday at 11:42 AM on February 26, 2012


supercres: "unless you're running sound over the same cable"

Speaking of sound, using a DVI output is no impediment for modern NVidia cards. I don't know what sort of black magic they employ, but they transmit sound to devices at the other end of a DVI to HDMI cable. In light of that, there's not much reason to go out of your way to find a card with multiple HDMI ports.
posted by wierdo at 5:02 PM on February 26, 2012


This thread is closed to new comments.