How can I get my Sony HDTV to listen to my MacBook Pro?
December 6, 2006 7:36 AM

Please, help me understand DVI, HDCP, etc., so that I can use my HDTV as a display for my MacBook Pro.

I have a Sony HD Trinitron -- a KV-30HS510 -- that I'd really like to connect via DVI for use as a display for my MacBook Pro, but I have no idea what cables or converters I need to get -- there seem to be so many types of DVI! I also have a sneaking suspicion that Sony's copy-protection scheme, HDCP, may actually succeed at impeding my ability to do this very simple, non-piracy-motivated thing with this hardware that I ostensibly own, which seems too ridiculous to be true, but I'd like some independent confirmation. Thanks for your help, friends.
posted by Embryo to Technology (6 answers total)
 
HDCP is a scheme that lets compliant source devices refuse to send protected content to non-compliant displays. That is to say, if your MacBook isn't applying HDCP restrictions to its output, it doesn't matter what DVI-compliant device you use to display the MacBook output. So don't worry.

The key to getting a good picture from computer output on a TV is resolution compatibility. If your computer won't output the TV's native resolution, it's going to be the same as running any LCD at a non-native resolution: jaggy and/or stretched.
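
If you're curious which resolutions your Mac is actually willing to drive a display at, here's a rough sketch in C against the Quartz Display Services API that lists every mode for each attached display. Untested, and it assumes the TV shows up as a second display once it's connected:

/* List the display modes OS X offers each attached display.
   Compile: gcc modes.c -o modes -framework ApplicationServices */
#include <ApplicationServices/ApplicationServices.h>
#include <stdio.h>

int main(void) {
    CGDirectDisplayID displays[8];
    uint32_t count = 0;
    if (CGGetActiveDisplayList(8, displays, &count) != kCGErrorSuccess)
        return 1;

    for (uint32_t i = 0; i < count; i++) {
        /* CGDisplayAvailableModes follows the Get rule: no release needed. */
        CFArrayRef modes = CGDisplayAvailableModes(displays[i]);
        printf("Display %u:\n", (unsigned)i);
        for (CFIndex j = 0; j < CFArrayGetCount(modes); j++) {
            CFDictionaryRef mode = CFArrayGetValueAtIndex(modes, j);
            int w = 0, h = 0;
            double hz = 0;
            CFNumberGetValue(CFDictionaryGetValue(mode, kCGDisplayWidth),
                             kCFNumberIntType, &w);
            CFNumberGetValue(CFDictionaryGetValue(mode, kCGDisplayHeight),
                             kCFNumberIntType, &h);
            CFNumberGetValue(CFDictionaryGetValue(mode, kCGDisplayRefreshRate),
                             kCFNumberDoubleType, &hz);
            printf("  %dx%d @ %.0f Hz\n", w, h, hz);
        }
    }
    return 0;
}

If the TV's native resolution isn't in that list, you're going to be scaling no matter what cable you use.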
posted by rxrfrx at 7:59 AM on December 6, 2006


You need to make sure your video card supports true HDCP output over DVI. Almost none of them do, despite listing HDCP among their features.

Here's a thread I found with a quick search that should be useful for you:

http://www.hardforum.com/showthread.php?t=1071342
posted by empyrean at 8:02 AM on December 6, 2006


Ah, and rxrfrx addressed whether you'd need it at all. He's right: basically, you'd only need HDCP capability if you were playing encrypted high-definition content (from Blu-ray, HD DVD, etc.). But if you wanted to do that, you'd need the right video card! =)
posted by empyrean at 8:05 AM on December 6, 2006


What kind of inputs are on your TV?

HDCP is the protocol that encrypts the video data in transit to your TV.

HDMI is the port found on most HDTVs these days, and it's one of the connections HDCP runs over (DVI is the other). HDMI is similar to DVI, but carries the audio signal as well. Assuming you're going to an HDMI port, you'll want a DVI-to-HDMI cable.
posted by kableh at 8:33 AM on December 6, 2006


You just need a DVI-to-HDMI cable. There is exactly one type of these, so you don't need to worry about compatibility.

You're better off with a TV that has HDCP, because it means your TV can accept encrypted content in addition to unencrypted content. Having it doesn't add any restrictions.

(and it's an Intel technology, not a Sony one)
posted by cillit bang at 9:00 AM on December 6, 2006


Since your computer has a DVI-out port and your HDTV has a DVI input, the only thing you need to use your TV as a monitor for your computer is a DVI cable. Just DVI on one end, and DVI on the other.

You don't want a DVI-to-HDMI cable, because I don't think that specific TV has an HDMI input.
This is what I found for its inputs:

"S-Video x 2 • Component x 2 • Composite x 3 • DVI x 1 • RF x 2 • Audio (RCA) x 6"

And the outputs of the MacBook Pro:

"DVI-out port for external display (VGA-out adapter included, Composite/S-Video out adapter sold separately);"

Also, you shouldn't need to worry about HDCP, because I highly doubt your MacBook is going to output anything that uses HDCP, and even if it did, I doubt the DVI-out port is HDCP-compliant anyway, so it wouldn't matter. However, if you do plan on outputting anything that uses HDCP, why not just use the S-Video out from your computer to the TV?

Just get a regular DVI cable.
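
If the picture comes up at the wrong resolution, the Displays preference pane is the easy fix, but for what it's worth, here's a rough C sketch of forcing a mode with the Quartz Display Services calls. Untested, and the 1920x1080-at-32-bpp target is just an assumption -- swap in whatever mode the set actually accepts over DVI:

/* Switch the second display to the mode closest to a target.
   Compile: gcc setmode.c -o setmode -framework ApplicationServices */
#include <ApplicationServices/ApplicationServices.h>
#include <stdio.h>

int main(void) {
    CGDirectDisplayID displays[8];
    uint32_t count = 0;
    CGGetActiveDisplayList(8, displays, &count);
    if (count < 2) {
        fprintf(stderr, "No second display found\n");
        return 1;
    }

    /* 1920x1080 at 32 bits per pixel is a guess; substitute whatever
       your set actually advertises on its DVI input. */
    boolean_t exact = 0;
    CFDictionaryRef mode = CGDisplayBestModeForParameters(
        displays[1], 32, 1920, 1080, &exact);
    if (!exact)
        fprintf(stderr, "No exact match; using the closest mode\n");

    /* Mode changes made this way last for the session; the Displays
       preference pane can always put things back. */
    return CGDisplaySwitchToMode(displays[1], mode) == kCGErrorSuccess ? 0 : 1;
}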

Good Luck.
posted by trueluk at 9:51 AM on December 6, 2006

