How to use an HDMI cable to connect my PC to my monitor
August 3, 2013 6:13 PM   Subscribe

So I am currently using a VGA cable but would like to use an HDMI cable instead, since I figure it would result in better quality. But apparently you can't just plug and play the HDMI cable as with the VGA cable?

So I thought I would just connect the HDMI cable from my case to my monitor and that would be it. But when I remove the VGA cable my monitor shows "Check signal ...", and when I plug in the HDMI cable it's still the same.

1. So I did some research and apparently you need to be on 60 Hz and not 59 Hz for HDMI? Is that true? I tried to change that in Windows 8 by right-clicking the desktop -> Screen Resolution -> Advanced Settings -> Monitor -> changed to 60 Hz, then clicked Apply. But it keeps returning to 59?

But when I go into Catalyst Control Center it says that I am at 60 Hz? (And when I change it to 59 Hz in CCC it goes back to 60...) And when I use Piriform's Speccy it says that I am at 59 Hz? So I don't even know what frequency I am at.
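
For what it's worth, the 59 Hz entry Windows shows usually just corresponds to the 59.94 Hz TV timing, so it is unlikely to be what's blocking the HDMI signal. If you want to see what Windows itself is actually driving, independent of what the dialog, CCC, or Speccy report, here is a minimal sketch (assuming Windows and Python 3; it calls the documented EnumDisplaySettingsW API through ctypes):

    # Query the mode Windows is currently driving on the primary display,
    # and list every refresh rate the driver exposes for that resolution.
    import ctypes
    from ctypes import wintypes

    ENUM_CURRENT_SETTINGS = -1  # special iModeNum value for the active mode

    class DEVMODEW(ctypes.Structure):
        _fields_ = [
            ("dmDeviceName", wintypes.WCHAR * 32),
            ("dmSpecVersion", wintypes.WORD),
            ("dmDriverVersion", wintypes.WORD),
            ("dmSize", wintypes.WORD),
            ("dmDriverExtra", wintypes.WORD),
            ("dmFields", wintypes.DWORD),
            ("dmPositionX", ctypes.c_long),
            ("dmPositionY", ctypes.c_long),
            ("dmDisplayOrientation", wintypes.DWORD),
            ("dmDisplayFixedOutput", wintypes.DWORD),
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", wintypes.WCHAR * 32),
            ("dmLogPixels", wintypes.WORD),
            ("dmBitsPerPel", wintypes.DWORD),
            ("dmPelsWidth", wintypes.DWORD),
            ("dmPelsHeight", wintypes.DWORD),
            ("dmDisplayFlags", wintypes.DWORD),
            ("dmDisplayFrequency", wintypes.DWORD),
            ("dmICMMethod", wintypes.DWORD),
            ("dmICMIntent", wintypes.DWORD),
            ("dmMediaType", wintypes.DWORD),
            ("dmDitherType", wintypes.DWORD),
            ("dmReserved1", wintypes.DWORD),
            ("dmReserved2", wintypes.DWORD),
            ("dmPanningWidth", wintypes.DWORD),
            ("dmPanningHeight", wintypes.DWORD),
        ]

    def enum_modes():
        """Yield (width, height, hz) for every mode the driver reports."""
        i = 0
        dm = DEVMODEW()
        dm.dmSize = ctypes.sizeof(DEVMODEW)
        while ctypes.windll.user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
            yield dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency
            i += 1

    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
    cur = (dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency)
    print("current mode: %dx%d @ %d Hz" % cur)
    print("refresh rates offered at this resolution:",
          sorted({hz for w, h, hz in enum_modes() if (w, h) == cur[:2]}))

Drivers often list both a 59 Hz and a 60 Hz entry for the same resolution; they differ by a fraction of a hertz, and either one should be fine over HDMI.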

2. Also, there are two HDMI ports on my PC: one on the motherboard and one on the graphics card. I tried both, with no success. And which one should I use if I manage to make it work?

Any ideas? Thanks!

PS: The problem is not the cable, since I have used it to connect to my TV and it works fine.
posted by iliketothinknu to Computers & Internet (6 answers total) 2 users marked this as a favorite
 
Oftentimes monitors will not auto-detect the input change, so you have to manually switch them from VGA to HDMI. Is there a source/input button on your monitor?
posted by zinon at 6:20 PM on August 3, 2013 [2 favorites]


Try turning the monitor off and on with the HDMI cable connected. Windows may recognize it as a new device once it sees the EDID information coming over the HDMI as the monitor turns on, and the monitor should look for a signal on all of its ports when it turns on.
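
One way to confirm Windows has actually received the monitor's EDID over HDMI is to look at what Plug and Play has cached in the registry. A minimal sketch, assuming Windows and Python 3 (the path below is the standard location where monitor EDIDs are stored):

    # Dump every cached monitor EDID and sanity-check its 8-byte header.
    import winreg

    DISPLAY_KEY = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    EDID_HEADER = b"\x00\xff\xff\xff\xff\xff\xff\x00"

    def subkeys(key):
        for i in range(winreg.QueryInfoKey(key)[0]):
            yield winreg.EnumKey(key, i)

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_KEY) as display:
        for model in subkeys(display):
            with winreg.OpenKey(display, model) as model_key:
                for instance in subkeys(model_key):
                    try:
                        params = winreg.OpenKey(model_key, instance + r"\Device Parameters")
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                    except OSError:
                        continue  # stale entry with no EDID stored
                    print(model, instance,
                          "bytes:", len(edid),
                          "header ok:", bytes(edid[:8]) == EDID_HEADER)

If the monitor only ever shows up here for the VGA connection and never for HDMI, the handshake is failing before Windows sees the display at all, which points back at input selection or cabling rather than refresh-rate settings.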

Which output port on your computer is primary will depend on your specific computer (motherboard or graphics card, depending on whether you're on integrated graphics or not).

And the refresh rate is not something you should have to mess with. The monitor will include the correct refresh rate in the info it sends over the HDMI cable to the computer, and CCC should pick it up. What AMD/ATI chipset or GPU are you on?
posted by snuffleupagus at 8:13 PM on August 3, 2013


(And if the power button doesn't work, try disconnecting the power to the monitor and reconnecting it after a little while. Sometimes you need that harder power cycle for the monitor to trigger detection.)
posted by snuffleupagus at 8:18 PM on August 3, 2013


I've had a situation before where a PC will refuse to recognize a video output port until it's rebooted with that port engaged. So try shutting down, plugging cables in the way you'd like them to be, and then turning your computer back on.
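
Short of a full reboot, it can also be worth nudging Windows into re-enumerating its outputs from the projection menu. A rough sketch (assuming Windows 7/8, where the stock DisplaySwitch.exe utility takes the same options as Win+P):

    # Toggle the projection mode, which sometimes forces Windows to
    # re-detect attached displays without a reboot.
    import subprocess
    import time

    subprocess.run(["DisplaySwitch.exe", "/extend"], check=False)
    time.sleep(5)  # give the desktop a moment to reconfigure
    subprocess.run(["DisplaySwitch.exe", "/clone"], check=False)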
posted by haykinson at 10:15 PM on August 3, 2013


You should probably be using DVI instead of HDMI. Your video card might not recognize HDMI as a "primary" display. You might also be running into issues with HDCP (video copy protection) -- HDMI typically has this, DVI typically doesn't.

Regarding question #2, you should be using the ports on your video card, not the ones on your motherboard (your video card overrides the onboard video, so the motherboard ports are likely disabled).

If you insist on using HDMI, you might get better results with a DVI->HDMI adapter (looks like this). If you have an aftermarket video card it probably came with one of these, or else you can get one for a few bucks. It's a passive adapter because HDMI and DVI are (mostly) the same digital signal.

(A DVI port looks like this, on the left. Your video card probably has one.)
posted by neckro23 at 2:21 AM on August 4, 2013


I must respectfully disagree with neckro23.

HDMI is literally the same as DVI (digital) with an added audio connection and support for HDCP (that's why the adapters work; they're just changing the connector). HDCP is not an issue unless you're playing back protected content, but HDCP won't work through an adapter. If anything, you introduce problems with HDCP by using an HDMI/DVI adapter.

What’s the Difference Between HDMI and DVI? Which is Better?
DVI is one of the most common digital video cables you’ll see on desktops and LCD monitors today. It’s the most similar to VGA connectors, with up to 24 pins and support for analog as well as digital video. DVI can stream up to 1920×1200 HD video, or with dual-link DVI connectors you can support up to 2560×1600 pixels. Some DVI cables or ports may include fewer pins if they are designed for lower-resolution devices, so you’ll need to watch for this. If your port contains all the pins, however, it can support the max resolution with no problem. The biggest problem with DVI is that it doesn’t support HDCP encryption by default, so if your hardware only includes DVI ports, you may not be able to play back full HD Blu-rays and other HD content.

You can connect DVI to an HDMI port on a newer monitor with a small digital converter. However, since DVI doesn’t support audio, you’ll need to use a separate cable for audio when connecting to an HDMI port. This makes DVI one of the more versatile newer connectors. It’s both backwards and forwards compatible, though at the loss of some convenience. You can also easily connect an older monitor that only includes a VGA port to a DVI port via a similar DVI to VGA converter, if your video output supports analog video.


And an adapter won't usually solve a detection problem. DVI and HDMI monitors are detected the same way: the computer sees the EDID information coming from the display. Going DVI->HDMI with an adapter only helps if the computer is flubbing detection over one port but not the other, usually due to some wonky problem on the computer side.

There's an unusual situation where a display doesn't signal that it can handle the same resolutions over HDMI as over DVI, i.e. PC display resolutions aren't included in the EDID sent from the HDMI port. This guide discusses that issue, but it really is unusual (except sometimes in panels designed as HD televisions being used as monitors), especially given how long HDMI has been out now. And an adapter won't usually solve that problem (or didn't when I've encountered it); rather, it calls for a configurable HDMI switch that will suppress the monitor's EDID in favor of a custom EDID, or a software override.
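
If you suspect that edge case, one way to check is to parse the preferred timing out of the raw EDID the monitor sends over HDMI (for example, the bytes dumped from the registry in the snippet further up). A sketch over the standard EDID 1.3 layout; the function and variable names are mine:

    # Decode the first detailed timing descriptor (the preferred mode)
    # from a raw 128-byte-or-longer EDID block.
    def preferred_timing(edid):
        dtd = edid[54:72]                                # first descriptor
        pixel_clock = (dtd[0] | (dtd[1] << 8)) * 10_000  # Hz
        if pixel_clock == 0:
            return None                                  # not a timing descriptor
        h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
        h_blank  = dtd[3] | ((dtd[4] & 0x0F) << 8)
        v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
        v_blank  = dtd[6] | ((dtd[7] & 0x0F) << 8)
        refresh = pixel_clock / ((h_active + h_blank) * (v_active + v_blank))
        return h_active, v_active, round(refresh, 2)

    # e.g. preferred_timing(edid_bytes) -> (1920, 1080, 60.0) for a monitor
    # whose HDMI EDID advertises 1920x1080 at 60 Hz as its native mode.

If the preferred mode that comes back is a TV timing rather than the panel's native PC resolution, that is the symptom the guide above is describing.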

Finally, we don't know whether the OP has an onboard chipset, a discrete GPU, or both; and if he does have both, whether the onboard video is also active is often controlled by BIOS settings.
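
One way for the OP to see which adapter Windows is actually using, without going into the BIOS, is to enumerate the display adapters and check which ones are attached to the desktop. A minimal sketch, assuming Windows and Python 3:

    # List video adapters and whether each one is driving the desktop
    # and/or hosting the primary display.
    import ctypes
    from ctypes import wintypes

    ATTACHED_TO_DESKTOP = 0x00000001
    PRIMARY_DEVICE      = 0x00000004

    class DISPLAY_DEVICEW(ctypes.Structure):
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    i = 0
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(DISPLAY_DEVICEW)
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        print(dev.DeviceName, "-", dev.DeviceString,
              "| attached:", bool(dev.StateFlags & ATTACHED_TO_DESKTOP),
              "| primary:", bool(dev.StateFlags & PRIMARY_DEVICE))
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(DISPLAY_DEVICEW)
        i += 1

If only the onboard adapter shows up as attached, the discrete card's outputs may be disabled (or vice versa), which would explain why one of the two HDMI ports never produces a signal.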
posted by snuffleupagus at 7:01 AM on August 4, 2013 [1 favorite]


This thread is closed to new comments.