Downsides to using an HDTV as a computer monitor?
December 28, 2008 3:53 AM   Subscribe

I'm thinking about getting an HDTV to use as a monitor for my computer, because the versatility and cost/size ratio seem much better. What are the disadvantages?

The only disadvantage I've heard is that TVs are much brighter than computer monitors, which would make it necessary to sit farther away to avoid eyestrain. I've also heard speculation that TVs distort colors to make the image appear artificially vivid. Does anyone know of any other drawbacks?

(In case it matters, I plan to do a fair amount of digital art and video.)
posted by ambulatorybird to Computers & Internet (17 answers total) 4 users marked this as a favorite
Best answer: I do this. There are a couple of things you will want to be aware of:

a) You really want either VGA or HDMI. Seriously. The component dongles on most video cards have this habit of sucking. I don't know why that is.

b) HDTVs are often VERY picky about the particular resolutions and dot clocks they accept. In particular, there is a set of HDTV-defined timings, which often don't match what a computer sensibly outputs. Make sure you have a video card that can output the right ones - nvidia and ati should be okay, but it required a magic modeline in Linux and some tweaking in PowerDesk on Windows for me!

c) When LCDs say 1366x768, they often really mean that 1280x720 is the 1:1 pixel-ratio input - this is how crappy monitor firmware tries to work around (b) by simulating a computer mode. Use 1280x720 in that case. 1920x1080 sets seem to avoid a lot of this bogosity - but I've found that some don't accept the native mode on the VGA input - beware! (But you probably have VGA, HDMI, and component on your video card anyhow.)

d) Officially, an HDTV expects a gamma of 2.2/2.5/2.8, depending on country*. Officially, your PC outputs sRGB (around 2.2-ish), and a Mac outputs 1.8. This required a magic setting on my monitor to put it in sRGB colour mode; not all sets have this. For video, that may well be what you want anyhow. For Photoshop, you can recalibrate the colour - and you'll need to, to make your stuff not look like ass on normal people's sets.

e) I've been using 720p, which I like, but my friends bitch that they couldn't work at that res - my goal was to sit further from my screen, and it worked. 1080p should be okay for everyone, though - just consider it before you buy!

* This differs officially between NTSC and PAL countries, but usually doesn't in practice.
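For reference on points (b) and (c), here's the sort of "magic modeline" I mean - a sketch using the standard CEA-861 timing for 1280x720 @ 60Hz (1650 x 750 total pixels x 60Hz = 74.25MHz dot clock). Your particular set may insist on slightly different numbers, so treat this as a starting point:

```
# xorg.conf, Monitor section: CEA-861 720p60 timing
#                       clock  hdisp hss  hse  htot  vdisp vss vse vtot
Modeline "1280x720_60"  74.25  1280 1390 1430 1650   720  725 730 750  +hsync +vsync
```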
posted by jaymzjulian at 4:27 AM on December 28, 2008

(The colour distortion complaints you mention above, btw, are point (d). Almost all HDTVs I've seen do have a colour mode setting, however. Also, yeah, you'll want to turn down the brightness - but that's true of watching TV on it too!)
posted by jaymzjulian at 4:29 AM on December 28, 2008

Why would you put a VGA signal (analog) into a digital monitor? Either use DVI or, gods forbid, the DRM-crippled HDMI. This may require buying a new video card for the PC. You're probably SOL on most laptops.

OH, and don't get confused between an HDTV monitor and a real HDTV. TVs have an ATSC tuner that will receive over-the-air DTV signals.
posted by Gungho at 5:24 AM on December 28, 2008

I have a 42" Regza LCD as my main TV with an HTPC hooked up to it. I have an MSI Diamond Plush 8600GTS that has a native HDMI output, and the HTPC has an AVerMedia PCI-E capture card. The picture looks fantastic. I use Vista Media Center but mostly watch shows from Hulu.
posted by bleucube at 6:05 AM on December 28, 2008

Gungho: The reasons you would put a VGA signal into a digital monitor are actually pretty stupid - often the processing path of the VGA vs. the HDMI input is different, and you can actually end up with a better _PC_ signal on the VGA input (think sRGB vs. PAL/NTSC gamma, and often-forced features such as deinterlacing and noise reduction - both killers if you're using it as a PC monitor). Additionally, if VGA is what you have, then VGA is what you have - honestly, as long as you aren't using Joe Shonky's $5 VGA cable, the output should be pixel-perfect at native res/timings anyhow*.

* Someone actually tested this for 1080p over both component and VGA, btw, which was an interesting test, but I've yet to find it again.

bleucube: ironically, you want your TV to behave like a TV and not like a monitor in that case - that's a pretty different use case, since you absolutely want your colour and processing to be TV-like rather than PC-like.
posted by jaymzjulian at 6:15 AM on December 28, 2008

(This is not to say that the DVI/HDMI inputs aren't superior in a lot of cases - IF they behave "correctly", of course they're better. Just... well, TV manufacturers fuck this up a lot. Which makes sense, since the tested use case is usually as a TV...)
posted by jaymzjulian at 6:17 AM on December 28, 2008

This is what I've said about it before. In short, you want 1:1 pixel mapping.
posted by i_am_a_Jedi at 6:41 AM on December 28, 2008

Best answer: I've also heard speculation that TVs distort colors to make the image appear artificially vivid.
This is true when you take the television out of the box and plug it in, but it is possible to calibrate the television for accurate NTSC color reproduction. Manufacturers jack up the brightness and sharpness settings so that their sets pop when placed next to all of the other televisions on the display wall. You can either pay somebody to calibrate it or buy something like Digital Video Essentials and get it as close as you can. I used DVE and it produced a very noticeable difference on my television. Other people with nice televisions usually remark on how good mine looks in comparison to theirs--I usually end up lending out the disc, and they achieve the same results.
posted by TrialByMedia at 6:49 AM on December 28, 2008 [1 favorite]

honestly, as long as you aren't using Joe Shonky's $5 VGA cable

Don't get fooled into throwing money at uselessly over-engineered cables. At short distances of up to 6ft (2 meters), you can get away with pretty much any cable.
posted by ellF at 8:33 AM on December 28, 2008 [1 favorite]

I've never tried this. But if I did, I'd make sure my HDTV has a "game mode" where it does a minimum of video processing. Reduces lag and artifacts: you don't want edge detection making echoes on the sides of all your text.
posted by Nelson at 8:37 AM on December 28, 2008

Nelson: Be careful with that - Game Mode on a lot of sets actually does MORE processing, such as lightening dark areas (think: a Doom 3 filter :)) and additional motion compensation, which may not be desirable on a PC desktop.

ellF: Yeah, I've only ever actually seen one dodgy VGA cable. But I thought it bore mentioning, to preempt the inevitable "I had a broken VGA cable, and it looked like shit!" arguments :))

TrialByMedia: DVE is awesome, but that's calibrating for a TV, not a monitor - right idea, wrong colour space. (I should find a digicam so I can photograph the difference...) You're probably better off with the Adobe calibrator or similar for calibrating it as a PC monitor.
posted by jaymzjulian at 8:50 AM on December 28, 2008

* (btw, I am aware that officially, HDTV is supposed to be sRGB. I've never, ever seen it implemented this way on a TV not designed as a computer monitor, and I've never, ever seen a TV station broadcast in this format. This is like Esperanto being the official language of the world: a great idea, but one that ignores the deployed reality :))
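If you want to put numbers on the sRGB-vs-gamma point, here's a quick sketch (plain Python, nothing TV-specific) comparing a pure 2.2 power law against the piecewise sRGB curve - they're close at mid-grey but diverge badly in the shadows:

```python
# Compare a plain gamma-2.2 decode against the piecewise sRGB transfer
# function (IEC 61966-2-1).  Signal values are normalised to 0.0-1.0.

def gamma22_to_linear(v: float) -> float:
    """Decode assuming a simple gamma-2.2 display."""
    return v ** 2.2

def srgb_to_linear(v: float) -> float:
    """Decode using the actual sRGB curve, which is linear near black."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

for v in (0.1, 0.5, 0.9):
    print(f"signal {v:.1f}: gamma 2.2 -> {gamma22_to_linear(v):.4f}, "
          f"sRGB -> {srgb_to_linear(v):.4f}")
```

At a signal of 0.5 the two decodes differ by under 2%, but down at 0.1 the sRGB result is over half again brighter - which is exactly why shadows look wrong when the set assumes the wrong curve.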
posted by jaymzjulian at 8:52 AM on December 28, 2008

jaymzjulian: Sorry, didn't mean to quote you then follow it up with "fooled" - you're quite right, in that you want a reasonable cable. Spending more than $10-$15 is being unreasonable, though, which is what I really meant. :)
posted by ellF at 8:53 AM on December 28, 2008

I actually use a Samsung HDTV as my main monitor. It's in my living room. I get to lounge on the couch to code. It also makes it easy for the wife and I to watch streaming video together.

The only problem I have is that I sit at movie-watching distance from it. This means that I have to turn up the font size on everything in order to read comfortably.
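(Back-of-the-envelope: the font size you need scales linearly with viewing distance, so sitting four times as far away means roughly four times the point size. The distances below are made up for illustration:)

```python
# Font size needed to subtend the same visual angle scales linearly
# with viewing distance: double the distance, double the point size.
def equivalent_font_size(base_pt: float, base_dist_cm: float,
                         new_dist_cm: float) -> float:
    return base_pt * (new_dist_cm / base_dist_cm)

# 10pt at a 60cm desk becomes 40pt from a 240cm couch
print(equivalent_font_size(10, 60, 240))
```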

While most of my operating system (Ubuntu) can handle that just fine, websites are consistently broken. It appears that people code their divs and such with constant sizes that do not adapt to my increased font size: lines of text get cut off, and I sometimes miss entire elements. The result is that I frequently have to turn off style sheets in order to see the text on the page. It's kind of annoying, but I mostly cope.

(Incidentally, I think webdevs are fucking idiots for writing code this way. On top of annoying me, it's inaccessible. I have to turn up the font size 'cause I sit too far away; other people have to turn it up because they can't see.)
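(The pattern in question looks something like this - class name invented for illustration. Sizing in px pins the layout to the designer's font size, while em units and flexible heights scale with the reader's:)

```css
/* Fragile: fixed pixel sizes clip text when the user bumps the font size */
.sidebar {
  height: 200px;      /* box can't grow when lines wrap or get taller */
  font-size: 13px;    /* ignores the user's font preference */
  overflow: hidden;   /* overflowing text silently disappears */
}

/* More robust: size in ems and let the box grow with its content */
.sidebar {
  min-height: 15em;
  font-size: 1em;
}
```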
posted by Netzapper at 11:41 AM on December 28, 2008

On web browsing, and adding to what Netzapper said:
Firefox 3 introduced full-page zoom (accessed with Ctrl +/-), so try that. Firefox remembers your zoom level per domain, and you can also edit the available zoom levels in about:config - just filter for "zoom". You'll see a line of numbers separated by commas; throw more zoom levels in there if you want (2.5, 2.6, etc.).

I run at 1080p, set my Firefox font size up to about 10-14pt, and then zoom the page the rest of the way for a proportionally scaled view.
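(The pref in question is toolkit.zoomManager.zoomValues; the default list below is from memory of Firefox 3, so check yours before pasting:)

```
// about:config, filter: zoom
// stock list with a couple of extra high-zoom steps appended
toolkit.zoomManager.zoomValues = .3,.5,.67,.8,.9,1,1.1,1.2,1.33,1.5,1.7,2,2.4,3,4,5
```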
posted by ijoyner at 12:33 PM on December 28, 2008

I've recently replaced my old monitor with an HDTV. The one problem I have with it is that it takes up a lot more room than the monitor, even though the screen is the same size. Admittedly this can probably be attributed partly to the particular models I chose, but in general HDTVs still tend to be a lot bulkier than a similarly sized monitor - the bezel around the screen tends to be wider, and the extra inputs tend to make them deeper at the back.
posted by fearthehat at 12:36 PM on December 28, 2008

Response by poster: Thanks for the replies, everyone. I'll pay close attention to the supported resolutions and color settings.
posted by ambulatorybird at 5:13 PM on December 28, 2008

This thread is closed to new comments.