New monitor, looks poopie.
May 6, 2010 7:27 PM   Subscribe

My beautiful CRT finally died today. The replacement LCD is NOT plug and play. Help!

Until today, I had a beautiful CRT monitor, but it finally gave up the ghost. Went out, bought me a Compaq S2022a LCD. It's supposed to be 1600x900 resolution. Got it home, set it up, plugged it in. Ouch...

I installed the drivers on my Dell Optiplex GX520. The docs that came with it said the resolution should be set to 1600x900; sadly, the highest selection on the Display Settings screen is 1280x768. At this setting the video is letterboxed with black bars on the top and bottom. Setting it to 1280x720 gets rid of the bars, but that's smaller than the 1280x1024 I was running on the CRT.

I tried updating the monitor and video drivers, no luck.

The Display Settings screen will let me set up to 1920x1200 if I select Default Monitor, but if I try to apply the Default Monitor settings, they just revert to the S2022a 1024x768 settings.

As far as I can tell, the video chipset (Intel 82945G Express) should support the monitor's 1600x900 native resolution. How can I get there from here?
posted by Marky to Computers & Internet (9 answers total) 2 users marked this as a favorite
 
Are you still using a VGA cable, or did you use the HDMI input? VGA has a much broader selection of display modes available; HDMI is really only designed for the standard "HiDef" television modes.
posted by AndrewStephens at 7:44 PM on May 6, 2010


Best answer: Does this page help?
http://support.microsoft.com/kb/309569

Basically, look for the "Hide Modes that this monitor cannot display" checkbox in your graphics settings.
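
If you want to double-check what modes the video driver itself is offering (as opposed to what the monitor claims to support), you can enumerate them with the Win32 EnumDisplaySettings call. Here's a rough Python/ctypes sketch of that idea; the DEVMODE layout is simplified (trailing members are lumped into padding), the helper name driver_modes is made up, and the 1600x900 check at the end is just an example, so treat it as a starting point rather than a polished tool.

import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Field names follow the Win32 DEVMODEW structure; only the display
    # branch of its union is spelled out, everything after dmDisplayFrequency
    # is collapsed into padding so the struct size still matches.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmPadding", ctypes.c_byte * 32),   # remaining members, unused here
    ]

def driver_modes():
    """Yield (width, height, bits-per-pixel, refresh) for each mode the driver reports."""
    user32 = ctypes.windll.user32
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    i = 0
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        yield dm.dmPelsWidth, dm.dmPelsHeight, dm.dmBitsPerPel, dm.dmDisplayFrequency
        i += 1

modes = sorted(set(driver_modes()))
for w, h, bpp, hz in modes:
    print("%dx%d  %d-bit  %d Hz" % (w, h, bpp, hz))
if not any((w, h) == (1600, 900) for w, h, _, _ in modes):
    print("The driver never offers 1600x900 - look at the driver, not the monitor.")

If 1600x900 never shows up in that list, no amount of unchecking the hide-modes box will surface it; it's a driver (or card) limitation.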
posted by jsmith77 at 7:47 PM on May 6, 2010 [1 favorite]


Did you install drivers for the video card, or just something on a CD that came with the monitor? Those may be separate things (like some monitor-adjuster taskbar thingy), and you may need to update your video card driver specifically.
posted by Big_B at 8:23 PM on May 6, 2010


Best answer: Non-standard resolutions like 1600x900 are a PITA to run over VGA. If your video card has DVI output, use it; it will automatically detect the resolution of your monitor. If not, I would highly recommend picking up a cheap video card with DVI. Picture quality will also improve.
posted by wongcorgi at 8:41 PM on May 6, 2010


Best answer: Older video cards tend not to support wide-screen resolutions; you might have to buy a new card.
posted by jjb at 9:07 PM on May 6, 2010


You shouldn't be using the analog connection anyway; digital will look a lot better.
posted by delmoi at 12:06 AM on May 7, 2010


Best answer: Since you haven't gotten much in the way of explanation, here is a little extra information.

1. CRTs typically accept analog input, which means the blue, 15-pin VGA cable that you are familiar with. VGA signals are actually voltages fluctuating up and down, much like audio signals.

2. The output of a CRT is similarly analog -- an electron beam lights up each phosphor and makes it glow. There are a fixed number of phosphor dots on the screen, but they are very fine and in a way, a bit more flexible, resolution-wise, than an LCD. It doesn't look as bad if you pick a non-native resolution on a CRT.

3. An LCD (the panel itself) is a digital device. It only accepts digital signals. Analog signals, like VGA input, must be translated somehow. An LCD which only has VGA input is not giving you full access to the LCD panel itself; your signal is being massaged somewhere inside the display. DVI cables (the large, usually white connectors) are the most "native" interface to an LCD presently available for desktop computers. If your LCD does not have this kind of connector, take it back.

4. Driving an LCD at its native resolution (meaning the number of pixels actually present in the panel) using DVI as your interface gives you direct digital control over each pixel. This is the best possible experience, with no smearing, stretching, cropping, or other issues. Trust me, it is absolutely worth it.

5. To drive your LCD this way, first get a DVI cable and connect it to both your monitor and tower. If your LCD doesn't support DVI, take it back. Really. If your tower doesn't support it, you can get DVI-capable video cards for almost nothing. You can get them used on eBay for $20 shipped, easy. Personally, I prefer Nvidia cards because I like how their drivers work. If your computer is a little old, you might even consider asking gamer nerd friends, because they often have boxes of video cards lying around.

6. With the two connected over DVI, your monitor should actually auto-configure itself to the correct resolution. However, if it doesn't, you should be able to select the correct mode manually without trouble. The monitor actually reports its supported modes over the cable, in a small data block called EDID (see the sketch after this list).

7. Just because your chipset supports some resolution/refresh-rate combo doesn't mean that the driver does; I think that's the problem you're facing now.
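
A footnote to point 6, in case anyone wonders what the monitor actually "reports": it hands the video card a 128-byte EDID block, and the first detailed timing descriptor inside it is the panel's native mode. Below is a minimal Python sketch of decoding that, assuming you've already dumped the raw EDID to a file somehow (on Windows it is stashed in the registry under the monitor's Device Parameters key); the "edid.bin" filename is just a placeholder, not something your system creates by itself.

def native_mode(edid):
    """Return (width, height) from the first detailed timing descriptor."""
    if edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not an EDID block")
    d = edid[54:72]                       # first 18-byte descriptor
    if d[0] == 0 and d[1] == 0:
        raise ValueError("first descriptor is not a detailed timing")
    width  = ((d[4] >> 4) << 8) | d[2]    # horizontal active pixels
    height = ((d[7] >> 4) << 8) | d[5]    # vertical active lines
    return width, height

with open("edid.bin", "rb") as f:
    print("Native mode: %dx%d" % native_mode(f.read(128)))

For the S2022a, that descriptor should come back as 1600x900; if the driver won't offer that mode even over DVI, the problem is squarely on the video card side.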

Please let me know if this was helpful. I know it is somewhat technical, but to really understand what's going on, it's good to have a grasp of the underlying technologies that are frustrating your desired outcome.
posted by fake at 6:10 AM on May 7, 2010


Response by poster: Ah, I sorta guessed but you all confirmed... time for a new video card. Thanks everyone!
posted by Marky at 9:30 AM on May 7, 2010


Have you looked at updating to the latest Intel drivers? Here is a link to the latest XP drivers, I think. Thinking back, I can see a GX520 being a bit long in the tooth with the original Dell-supplied drivers... A discrete video card would still help with speed, but with updated drivers at least you could limp along.
posted by arrjay at 9:40 AM on May 7, 2010


This thread is closed to new comments.