How can I increase the screen resolution of the TV I'm using as a monitor?
February 28, 2011 7:39 PM
How might I resolve my resolution confusion?
OK Hive mind. I give up. I've been banging my head against this problem for hours and now I find myself, humbly, looking for guidance.
So I just got a nice LG 42" or so TV for free since a ton of them were donated to my work. I don't really watch a lot of TV aside from on my computer, so I decided to try and set it up as my monitor.
Here's the problem. I can't seem to get the resolution to go any higher than 1024x768. I feel that this shouldn't be the case.
I'm running Windows 7 x64. The graphics card is a (what I understand to be fairly decent) NVIDIA GeForce 7900 GTX. I'm running that with a DVI-to-HDMI cable to the TV.
I tried installing the latest drivers for the graphics card. But even from its fancy little control panel thingy, the highest resolution available is 1024x768 (native). (Addendum: I know enough about what I'm doing to know that "native" indicates something important, but I'm not yet quite sure what.)
To confound matters further, as I was fumbling with my connections like a 12 year old trying to undo a bra for the first time, there was a strange occurrence. When I switched the DVI plug from one input to the other (the graphics card seems to have two of them), the options on the screen changed. I was rewarded for my fumblings with a vast list of resolution options to play with. Briefly, I rejoiced.
However, to my frustration, most of those tantalizing resolution settings remained beyond my reach, as selecting them would only cause the screen to go black, and the words "no input" to dance merrily around my screen, mocking me. In the end, the only new settings I could use were 1152x768 and 1280x854, which struck me as a small improvement, although soured by the fact that everything was unpleasantly squished. I suspect that to be due to the fact that both of those resolutions are intended for a 3:2 aspect ratio (thank you, Wikipedia), whereas my TV can be set only to 16:9 or 4:3. I suppose I should have settled for one of them, but I continued to poke and prod and fiddle and fumble with settings like a horse attempting hand surgery, and now both DVI inputs yield naught but the disheartening resolution of 1024x768 and its vile lackey, 800x600.
If I can't figure this out, then I suppose I'll just procure a SNES or something to make the TV useful, and return to my more modestly sized monitor. But perhaps, with the help of the hive mind, my nerdy and over-sized ambitions can be realized.
Thanks kindly.
Why do you feel this? Could you supply the model number of the TV in question?
posted by pompomtom at 7:49 PM on February 28, 2011
Here's the product description.
It claims that the native resolution is 1366x768. I tried putting those figures in the custom resolution thingy, but it wouldn't accept them either.
Could it have anything to do with the set frame refresh rate? I have it at 75 Hz.
posted by sarastro at 7:54 PM on February 28, 2011
Powerstrip will let you do magic with resolutions / custom resolutions etc. Runs great on Windows 7 Home Premium x64 for me, with a DVI-DVI cable to my TV.
Please note that this TV may only be 720p, which is 1280x720 max. Google the model number of the TV to find out.
posted by defcom1 at 7:54 PM on February 28, 2011
Oh, and the refresh rate should be 60 Hz, if it's a standard LCD panel. Powerstrip will let you set the output to 1366x768.
posted by defcom1 at 7:55 PM on February 28, 2011
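For a rough sense of why the 75 Hz setting matters less for bandwidth than for TV compatibility, here is an approximate pixel-clock calculation in Python. The blanking figures are ballpark CVT-style assumptions, not numbers from this thread:

```python
# Rough, illustrative pixel-clock estimate for 1366x768 at 60 Hz vs. 75 Hz.
# The blanking figures below are ballpark CVT-style assumptions, not numbers
# from this thread. Single-link DVI tops out around 165 MHz, so either rate
# fits electrically; the 60 Hz limit comes from what the TV's HDMI input
# accepts, not from the cable.
def approx_pixel_clock_mhz(width, height, hz, h_blank=300, v_blank=30):
    # total clocks per frame = (active + blanking) horizontally and vertically
    return (width + h_blank) * (height + v_blank) * hz / 1e6

for hz in (60, 75):
    print(f"1366x768 @ {hz} Hz needs roughly "
          f"{approx_pixel_clock_mhz(1366, 768, hz):.0f} MHz of pixel clock")
```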
Alas, installing Powerstrip didn't help. Even after selecting the 1366x768 setting and restarting the computer, it's not available in the selection list, and I'm still stuck at 1024x768.
posted by sarastro at 8:26 PM on February 28, 2011
I confess I don't know what it means to enable overscan. Can you elaborate?
Also, for some reason now I can no longer adjust my frame refresh rate (it's stuck at 60), and the image from certain things (like games on Steam) is now horribly grainy and oversharpened. Not quite sure what's going on with that.
I think I'll just start downloading Ubuntu now.
posted by sarastro at 9:04 PM on February 28, 2011
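If Ubuntu does end up in the picture, a minimal sketch of forcing a custom mode with cvt and xrandr looks like the following. The output name "HDMI-1" is an assumption; run xrandr by itself first to see the real connector name, and note that cvt may round 1366 up to 1368:

```python
# A minimal sketch (not from this thread) of adding 1366x768@60 on Ubuntu by
# calling cvt and xrandr. The output name "HDMI-1" is an assumption.
import subprocess

def add_custom_mode(output="HDMI-1", width=1366, height=768, hz=60):
    # cvt prints a modeline such as:
    #   Modeline "1368x768_60.00"  85.25  1368 1440 1576 1784  768 771 781 798 -hsync +vsync
    cvt = subprocess.run(["cvt", str(width), str(height), str(hz)],
                         capture_output=True, text=True, check=True)
    modeline = next(line for line in cvt.stdout.splitlines()
                    if line.startswith("Modeline"))
    parts = modeline.split()
    name, timings = parts[1].strip('"'), parts[2:]

    subprocess.run(["xrandr", "--newmode", name, *timings], check=True)        # define the mode
    subprocess.run(["xrandr", "--addmode", output, name], check=True)          # attach it to the output
    subprocess.run(["xrandr", "--output", output, "--mode", name], check=True) # switch to it

if __name__ == "__main__":
    add_custom_mode()
```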
addendum: I know enough about what I'm doing to know that "native" indicates something important, but I'm not yet quite sure what
LCDs don't work the same way as CRTs (tube-type monitors). In many ways the old-fashioned tube monitors were actually better than LCDs, and this is one of them: CRTs don't have native resolutions, LCDs do. What's native? Native means hard-wired. LCDs are physically constructed with a specific resolution in mind. You might be able to run in a different resolution, but it won't look nearly as sharp. With LCDs, you basically are stuck having to watch them at their native resolution for optimum quality.
The reason your video card isn't working is that either it thinks it's connected to a computer monitor instead of a television, OR your television thinks it's being used as a computer monitor and is defaulting to a "standard" computer monitor resolution. You need to be at 1366x768 @ 60p. The refresh rate should be whatever the default is; LCD panels are typically 60 or 75 Hz. It doesn't matter like it does with a CRT, because LCDs don't "refresh" in the same way (there's no electron beam scanning up and down the panel).
I think the problem is probably with your TV: see if you can force a video mode instead of letting it auto-detect.
posted by Civil_Disobedient at 10:32 PM on February 28, 2011
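To make the "native" point concrete, here is a quick illustrative calculation, assuming the panel really is 1366x768, of how much the TV's scaler has to stretch each of the modes that actually worked, and how far each source aspect ratio is from the panel's:

```python
# Quick illustrative arithmetic, assuming the panel really is 1366x768:
# how much the TV's scaler has to stretch each mode that actually worked,
# and how far the source aspect ratio is from the panel's (~16:9).
native_w, native_h = 1366, 768

for w, h in [(1024, 768), (1152, 768), (1280, 854)]:
    scale_x = native_w / w    # horizontal stretch applied by the scaler
    scale_y = native_h / h    # vertical stretch (or squeeze, if < 1)
    print(f"{w}x{h}: {scale_x:.2f}x horizontal, {scale_y:.2f}x vertical, "
          f"source aspect {w / h:.2f} vs. panel {native_w / native_h:.2f}")
```

The non-integer horizontal stretch is why non-native modes look soft, and the aspect mismatch is why everything looks squished.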
The product description doesn't list VGA or DVI inputs. How are you plugging the computer into the TV? Through DVI -> HDMI?
posted by qxntpqbbbqxl at 11:29 PM on February 28, 2011
The device tells the graphics card what resolutions it supports. That's why every resolution suddenly became available when you switched plugs: with nothing on that port telling the graphics card what it could accept, it defaulted to offering everything. If the only thing on the list is 1024x768, that's because that's what the display is reporting. You need to mess about in the buttons/menus of the display; the problem is not at the computer end. Try pressing the RATIO button on the remote to force 16:9 and then see if 1366x768 is offered.
qxntpqbbbqxl, the OP says he's using a DVI-HDMI cable in the post.
posted by Rhomboid at 11:50 PM on February 28, 2011
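The resolution list the card offers comes from the display's EDID data. As a rough sketch (assuming Windows has cached the TV's EDID under the usual registry path, which can vary by driver), this Python snippet reads each cached EDID and decodes the preferred mode from the first detailed timing descriptor:

```python
# Rough sketch: read each cached EDID from the Windows registry and decode
# the preferred (native) mode from the first detailed timing descriptor.
# Assumes the usual HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY layout;
# treat this as illustrative, not definitive.
import winreg

def cached_edids():
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as model_key:
                for j in range(winreg.QueryInfoKey(model_key)[0]):
                    instance = winreg.EnumKey(model_key, j)
                    try:
                        params = winreg.OpenKey(model_key, instance + r"\Device Parameters")
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                        yield model, bytes(edid)
                    except OSError:
                        continue  # this instance has no EDID stored

def preferred_mode(edid):
    dtd = edid[54:72]                       # first detailed timing descriptor
    h = dtd[2] | ((dtd[4] & 0xF0) << 4)     # horizontal active pixels
    v = dtd[5] | ((dtd[7] & 0xF0) << 4)     # vertical active lines
    clock_mhz = int.from_bytes(dtd[0:2], "little") / 100
    return h, v, clock_mhz

for model, edid in cached_edids():
    h, v, clk = preferred_mode(edid)
    print(f"{model}: preferred mode {h}x{v}, pixel clock {clk:.2f} MHz")
```

If the preferred mode reported there is 1024x768 rather than 1366x768, the TV (or its HDMI input) really is advertising itself as a plain computer monitor.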
Since it's a TV, I would suspect that it might only accept the TV-style resolutions. (Just looked at the link where it says "Compatibility" and shows only those resolutions.) See if you can download an owner's manual; it might give you better instructions.
As explained above, LCD, plasma, and DLP displays have "native" resolutions, that is, how many distinct dots the panel can physically create.
(CRTs have this too: when you look at a CRT up close, you see little red, green and blue dots. The scanner gun needs to line up with those dots. A black and white TV doesn't have those dots, the gun can literally point anywhere and create a dot. But a color CRT has to make the gun hit a green dot when it is looking for a green dot. So they have native resolutions also. There's also a limit as to how fast and how granularly (that's not a word) the gun can resolve dots. It happens in a slightly different way, but it does exist.)
Anyway, it is awesome if the dots sent out by the source map perfectly to how many dots there are on the display. But that almost always doesn't happen. So the devices all have translators that accept different resolutions and convert them to the one the machine can actually display. Sometimes what should be a perfectly good picture gets good and messed up because it has gone through one too many conversions.
Out of cheapness or simplicity, these will be different between manufacturers, model#s, etc. They may not have included a way for the device to accept the native resolution because no cable box, DVD player or broadcast standard outputs that resolution. And a TV can't expect one of those devices to be able to change what it outputs.
(Similarly, a computer monitor might not be expecting to be used as a TV, so it might not accept TV resolutions, but accept all manner of weird computer resolutions.)
On the computer, what does it say for what kind of device it is? Does it say "LG Digital TV" or something like that, or "generic display"? You might have to force that to change to allow the graphics card to output what you want.
Then set it at 1080 (1920x1080) or 720 (1280x720), if the TV manual doesn't tell you how to enable its native resolution.
Another definition: overscan. With old school CRT displays, it is very hard to get the electron gun to make crisp edges. (Think about a cheap computer monitor when you try to adjust the screen image right up to the edges- the image starts to get weird looking.) So to overcome this, they set the guns to "paint" the image a little wider than what is actually viewable, so that the viewable image is missing a little bit of the edges, but it isn't ugly looking. Most television programs were shot to allow for this- think about watching sports broadcasts or news ticker things, where they have information right at the edges of the screen. On some TVs, they are partially cut off, on other screens, you can actually see extra stuff beyond them. That's overscan.
Another complication is that broadcasters know this is happening, and started using the top and bottom lines for other stuff- closed captioning is one good example. If you ever watched a recording of an NTSC broadcast that was made on a computer, you might see little black and white dots and dashes at the top and bottom. That's the CC information. It is a limitation of the NTSC broadcast signal- it was invented before computers, and it has to be dead simple. It is just an analog wave that commands the guns in the TV. It's just line, line, line, line, repeat. So they had to cram in extra info into the lines somehow.
And sometimes, the things being broadcast don't always line up exactly right onto the full area of the image. If a video recorder or a camera isn't adjusted completely perfectly, its image might be a little smaller, larger, or off to one side or another.
So overscan both corrects for some things, and allows for others.
Anyhow, modern flat panel TVs sometimes have an overscan option to "fake up" overscan so some viewers aren't annoyed by these "defects". What this amounts to, however, is that when you hook up a computer, a bunch of stuff on the edges becomes invisible. That's why they (might) have an option to turn it on or off.
posted by gjc at 6:05 AM on March 1, 2011
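As a back-of-the-envelope illustration of the overscan point (the 5% figure is a typical assumption, not something from this TV's manual), here is roughly how much of a 1366x768 desktop disappears when the TV applies overscan:

```python
# Back-of-the-envelope overscan check. The 5% figure is a typical assumption,
# not something from this TV's manual: with overscan on, the edges of a
# 1366x768 desktop (taskbar, window borders) get pushed off-screen.
width, height = 1366, 768
overscan = 0.05                      # fraction of the image hidden overall

visible_w = int(width * (1 - overscan))
visible_h = int(height * (1 - overscan))
print(f"Visible area: roughly {visible_w}x{visible_h} "
      f"({width - visible_w} columns and {height - visible_h} rows lost at the edges)")
```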
I've run into this sort of issue. Basically, using DVI-to-HDMI as you're doing, the television will only accept a certain range of resolutions. Barring screwing with the television firmware, you're probably stuck with a select few options.
You may be able to get it to do 720p resolution if you use a video card with HDMI output, but the television is too low-res to do 1080, so you're probably going to do the best you can with 1024x768.
FWIW, I had this same issue with a 32" LG television probably around four or five years ago. If yours is that vintage, there you go.
posted by mikeh at 1:00 PM on March 1, 2011
Thanks for your help everyone, I ended up just cutting my losses and setting up the TV as a secondary monitor for movies and such. That seems to be working out just fine.
posted by sarastro at 4:47 PM on March 1, 2011
But a color CRT has to make the gun hit a green dot when it is looking for a green dot.
Technically, that's dot pitch. Dot pitch != resolution.
posted by Civil_Disobedient at 6:25 PM on March 1, 2011
This thread is closed to new comments.